Hands-on Scala Learning for Enterprise AI
Learn AI by Working With Our World-Class Advanced AI Computing Lab for Free.
Reinvent traditional software engineering with our enterprise AI platform. Move beyond instruction-driven, hard-coded
automation and build self-learning applications for the digital era.
Be Empowered by Our School of Artificial Intelligence and Enabled Through Our Intelligent Automation
Learn Enterprise AI and Enterprise Data Science With Scala
Acquire Cross-Functional Skills from Business to Data to Technology by Mastering AI
Learning Path

Learn How Our School of AI Teaches Workers to Transfer Business Processes to the Modern Digital World

Choose Your Hands-On Learning Unit
Develop an AI Application Using Our Pre-Built Scala Libraries. Coding Is No Longer Necessary
Master Artificial Intelligence with Hundreds of Pre-Developed Scala Programs
- Unit 1Scala Fundamentals
- Program #1Hello World
- Program #2Variable
- Program #3Type Integer
- Program #4Type Floating Point
- Program #5Type String
- Program #6Complex Numbers
- Program #7Function Defining
- Program #8Dynamic Typing
- Program #9Static Typing
- Program #10Boolean Evaluation
- Program #11If Statement
- Program #12If Else Statement
- Program #13Else If Statement
- Program #14Block Structure
- Program #15Whitespaces
- Program #16Regular Expressions
- Program #17Lists
- Program #18Tuple
- Program #19Sets
- Program #20Frozen Sets
- Program #21Collection Transition
- Program #22Loop Else
- Program #23Arguments
- Program #24Mutable Arguments
- Program #25Accepting Variable Arguments
- Program #26Unpacking Argument List
- Program #27Scope
- Program #28Error Handling
- Program #29Namespaces
- Program #30File Input Output
- Program #31Higher Order Functions
- Program #32Anonymous Functions
- Program #33Nested Functions
- Program #34Closure
- Program #35Lexical Scoping
- Program #36Operator
- Program #37Decorators
- Program #38List Comprehensions
- Program #39Generator Expressions
- Program #40Generator Functions
- Program #41Itertools Chain
- Program #42Itertools Izip
- Program #43Debugging Tool Print
- Program #44Classes
- Program #45Emulation
- Program #46Class Method
- Program #47Static Method
- Program #48Inheritance
- Program #49Encapsulation
- Program #50N-Dimensional Array
- Program #51Reading Writing Data
- Program #52Reading Data From CSV
- Program #53Normalizing Data
- Program #54Formatting Data
- Program #55Controlling Line Properties Matplotlib
- Program #56Plotting Simple Function
- Program #57Importing Module
- Program #58Creating Module
- Program #59Graphing Matplotlib Using Defaults
- Program #60Graphing Matplotlib Using Defaults Changing Colors
- Program #61Graphing Matplotlib Using Defaults Setting Limits
- Program #62Graphing Matplotlib Using Defaults Setting Ticks
- Program #63Graphing Matplotlib Using Defaults Setting Tick Labels
- Program #64Graphing Matplotlib Using Defaults Moving Spines
- Program #65Graphing Matplotlib Using Defaults Adding Legends
- Program #66Graphing Matplotlib Using Defaults Annotating Points Legends
- Program #67Data Manipulation Using Pandas
- Program #68Reading Data From Hana To Python
- Program #69Reading Writing Data
- Program #70Reading Data From CSV
- Program #71Normalizing Data
- Program #72Formatting Data
- Program #73Controlling Line Properties Matplotlib
- Program #74Plotting Simple Function
- Program #75Importing Module
- Unit 2Big Data Processing
- Program #76Creating Module
- Program #77Graphing Matplotlib Using Defaults
- Program #78Graphing Matplotlib Using Defaults Changing Colors
- Program #79Graphing Matplotlib Using Defaults Setting Limits
- Program #80Isotonic Regression Scikit Learn
- Program #81Neural Networks Scikit Learn
- Program #82Non Linear Svm Scikit Learn
- Program #83Decision Trees Scikit Learn
- Program #84Plotting Validation Curve Scikit Learn
- Program #85Loading Datasets Scikit Learn
- Unit 3Machine Learning: Part 1
- Program #86Mean Shift Clustering Algorithm Scikit Learn
- Program #87Affinity Propagation Clustering Algorithm Scikit Learn
- Program #88Dbscan Clustering Algorithm Scikit Learn
- Program #89Kmeans Clustering Algorithm Scikit Learn
- Program #90Spectral Bi Clustering Algorithm Scikit Learn
- Program #91Spectral Co Clustering Algorithm Scikit Learn
- Program #92Ridge Regression Scikit Learn
- Program #93Scientific Analysis Arrays Numpy
- Program #94Scientific Analysis Arrays Reshaping Numpy
- Program #95Scientific Analysis Arrays Concatenating Numpy
- Program #96Scientific Analysis Arrays Adding New Dimensions Numpy
- Program #97Scientific Analysis Arrays Initializing With Zeros Ones Numpy
- Program #98Scientific Analysis Mgrid Scipy
- Program #99Scientific Analysis Polynomial Scipy
- Program #100Scientific Analysis Vectorizing Functions Scipy
- Unit 4Machine Learning: Part 2
- Program #101Scientific Analysis Select Function Scipy
- Program #102Scientific Analysis General Integration Scipy
- Program #103Time Series Analysis Pandas
- Program #104Exporting Data Using Pandas
- Program #105Importing Data Using Pandas
- Program #106Data Analysis Pandas
- Program #107Empty Graph Networkx
- Program #108Graph Adding Nodes Networkx
- Program #109Graph Adding Edges Networkx
- Program #110Graph Display Networkx
- Program #111Graph Path Networkx
- Program #112Renaming Nodes Networkx
- Program #113Example Pymc
- Program #114L1 Based Feature Selection
- Program #115Line Plot Matplotlib
- Program #116Dot Plot Matplotlib
- Unit 5Predictive Analytics
- Program #117Numeric Plot Matplotlib
- Program #118Figures Axes Matplotlib
- Program #1192D Plotting Matplotlib
- Program #1202D Plot Scikit Learn
- Program #121Classification Scikit Learn
- Program #122Model Selection Scikit Learn
- Program #123Nearest Neighbours Regression Scikit Learn
- Program #124Graphing Matplotlib Regular Plot
- Program #125Graphing Matplotlib Scatter Plot
- Program #126Graphing Matplotlib Bar Plot
- Program #127Graphing Matplotlib Contour Plot
- Program #128Graphing Matplotlib Imshow
- Program #129Graphing Matplotlib Pie Chart
- Program #130Graphing Matplotlib Quiver Plot
- Program #131Graphing Matplotlib Grids
- Program #132Graphing Matplotlib Multi Plots
- Unit 6Advanced Machine Learning
- Program #133Graphing Matplotlib Polar Axis
- Program #134Graphing Matplotlib 3D Plot
- Program #135Graphing Matplotlib Texts
- Program #136Histogram Matplotlib
- Program #137Bed Occupancy Optimization
- Program #138Life Time Value Customer Prediction
- Program #139Customer Upselling Characteristics Prediction
- Program #140Sales Lead Prioritization
- Program #141Inventory Demand Forecasting
- Program #142Credit Card Fraud Risk
- Program #143Employee Churn Prediction
- Program #144Patient Medication Compliance Prediction
- Program #145Physician Attrition Prediction
- Program #146Patient Readmittance Rate Prediction
- Unit 7Data Visualization
- Program #147Patient Insurance Claim Prediction
- Program #148Drug Demand Forecasting
- Program #149Customer Retention Analysis
- Program #150Hospital Bed Turn Analysis
- Program #151Patient Survival Analysis
- Program #152Patient Medication Effectiveness Analysis
- Program #153Sales Growth Analysis
- Program #154Customer Cross Selling Analysis
- Program #155Product Customer Segmentation
- Unit 8Use Case Implementation: Part 1
- Program #156Employee Talent Management
- Program #157Patient Bed Occupancy
- Program #158Product Market Basket Analysis
- Program #159Automobile Claims Handling Analysis
- Program #160Customer Market Share
- Program #161Data Collection From Excel
- Program #162Data Collection From CSV
- Program #163Data Collection From Clipboard
- Program #164Data Collection From HTML
- Program #165Data Collection From XML
- Unit 9Use Case Implementation: Part 2
- Program #166Data Collection From JSON
- Program #167Data Collection From PDF
- Program #168Data Collection From Plain Text
- Program #169Data Collection From DOCX
- Program #170Data Collection From HDF
- Program #171Data Collection From Image
- Program #172Data Collection From MP3
- Program #173Data Collection From SAP HANA
- Program #174Data Collection From Hadoop
- Program #175Data Integration Concatenate
- Program #176Data Integration Merge
- Program #177Data Integration Join
- Program #178Data Mapping Dictionary Literal Values
- Program #179Data Mapping Dictionary Operations
- Program #180Data Mapping Dictionary Comparison Operations
- Program #181Data Mapping Dictionary Statements
- Program #182Data Provisioning Extraction
- Program #183Data Provisioning Transformation
- Unit 10Advanced Data Processing
- Program #184Data Provisioning Loading
- Program #185Iterators
- Program #186Generator Expressions
- Program #187Generators
- Program #188Bidirectional Communication
- Program #189Chaining Generators
- Program #190Decorators
- Program #191Decorators As Functions
- Program #192Decorators As Classes
- Program #193Decorators Copying Docstring Other Attributes
- Program #194Example Standard Library
- Program #195Deprecation Of Functions
- Program #196While Loop Removing Decorator
- Program #197Plugin Registration System
- Program #198Context Managers
- Program #199Context Managers Catching Exceptions
- Unit 11Advanced Programming
- Program #200Context Managers Defining Using Generators
- Program #201Ndarray
- Program #202Ndarray Block Of Memory
- Program #203Ndarray Data Type
- Program #204Ndarray Indexing Scheme Strides
- Program #205Ndarray Slicing With Integers
- Program #206Ndarray Transposing With Integers
- Program #207Ndarray Reshaping Integers
- Program #208Ndarray Broadcasting
- Program #209Ndarray Universal Function
- Program #210Ndarray Generalized Universal Function
- Program #211Ndarray Old Buffer Protocol
- Program #212Processing Opening Writing To Image
- Program #213Image Processing Displaying Images
- Program #214Image Processing Displaying Images Basic Manipulation
- Program #215Image Processing Geometrical Transformations
- Program #216Image Processing Filtering Blurring
- Program #217Image Processing Filtering Sharpening
- Program #218Image Processing Filtering Denoising
- Program #219Image Processing Filtering Apply Gaussian Filter
- Program #220Image Processing Filtering Apply Median Filter
- Program #221Image Processing Feature Extraction Edge Detection
- Program #222Image Processing Feature Extraction Segmentation
- Program #223Tensorflow Hello World
- Program #224Tensorflow Tensors
- Program #225Tensorflow Fixed Tensors
- Program #226Tensorflow Sequence Tensors
- Program #227Tensorflow Random Tensors
- Program #228Tensorflow Constants
- Program #229Tensorflow Variables
- Program #230Tensorflow Placeholders
- Program #231Tensorflow Graphs
- Program #232Tensorflow Session
- Program #233Tensorflow Feed Dictionary
- Program #234Tensorflow Data Type
- Program #235Tensorflow Add Two Constants
- Program #236Tensorflow Multiply Two Constants
- Program #237Tensorflow Matrix Inverse Method
- Program #238Tensorflow Queues
- Program #239Tensorflow Saving Variables
- Program #240Tensorflow Restoring Variables
- Program #241Tensorflow Tensorboard
- Program #242Tensorflow Namescope
- Program #243Tensorflow Linear Regression
- Program #244Tensorflow Logistic Regression
- Program #245Tensorflow Random Forest
- Program #246Tensorflow Kmeans Clustering
- Unit 12Working with Tensorflow
- Program #247Tensorflow Linear Support Vector Machine
- Program #248Tensorflow Non Linear Support Vector Machine
- Program #249Tensorflow Multi Class Support Vector Machine
- Program #250Tensorflow Nearest Neighbours
- Program #251Tensorflow Neural Networks
- Program #252Tensorflow Convolutional Neural Networks
- Program #253Tensorflow Deep Neural Networks
- Program #254NLP Installing Nltk
- Program #255NLP Count Word Frequency Nltk
- Program #256NLP Remove Stop Words
- Program #257NLP Tokenize Text Nltk
- Program #258NLP Tokenize Non English Text Nltk
- Program #259NLP Get Synonyms Nltk
- Program #260NLP Get Antonyms Nltk
- Unit 13Working with NLP
- Program #261NLP Word Stemming Nltk
- Program #262NLP Non English Word Stemming Nltk
- Program #263NLP Lemmatizing Words Nltk
- Program #264NLP Part Of Speech Tagging Nltk
- Program #265NLP Chinking Nltk
- Program #266NLP Chunking Nltk
- Program #267NLP Corpora Nltk
- Program #268NLP Named Entity Recognition Nltk
- Program #269NLP Text Classification Nltk
- Unit 14Working with Computer Vision
- Program #270NLP Converting Words To Features Nltk
- Program #271NLP Naive Bayes Classifier Nltk
- Program #272NLP Save Classifier Nltk
- Program #273NLP Scikit Learn Algorithms Nltk
- Program #274NLP Combining Algorithms Nltk
- Program #275NLP Noise Removal Nltk
- Program #276NLP Noise Removal Regular Expressions Nltk
- Program #277NLP Object Standardization Nltk
- Program #278NLP Topic Modelling Nltk
- Program #279NLP Ngrams Nltk
- Program #280NLP Tfidf Vectorizer_nltk
- Program #281NLP Word Embedding Nltk
- Program #282NLP Text Matching Levenshtein Distance Nltk
- Program #283NLP Cosine Similarity Nltk
- Program #284NLP Wordnet Nltk
- Unit 15Working with RPA
- Program #285Computer Vision Install Opencv
- Program #286Computer Vision Reading Images Opencv
- Program #287Computer Vision Displaying Images Opencv
- Program #288Computer Vision Writing Images Opencv
- Program #289Computer Vision Color Space Opencv
- Program #290Computer Vision Thresholding Opencv
- Program #291Computer Vision Finding Contours Opencv
- Program #292Computer Vision Image Scaling Opencv
- Program #293Computer Vision Image Rotation Opencv
- Program #294Computer Vision Image Translation Opencv
- Program #295Computer Vision Image Edge Detection Opencv
- Program #296Computer Vision Image Filtering Opencv
- Program #297Computer Vision Image Filtering Blurring Opencv
- Program #298Computer Vision Image Filtering Blurring Gaussian Blur Opencv
- Unit 16Working with Deep Learning
- Program #299Computer Vision Image Filtering Blurring Median Blur Opencv
- Program #300Computer Vision Image Filtering Bilateral Opencv
- Program #301Computer Vision Morphological Operations Erosion Opencv
- Program #302Computer Vision Morphological Operations Dilation Opencv
- Program #303Computer Vision Morphological Operations Opening Opencv
/*****************************
File Name : CSLAB_HELLO_WORLD_V1
Purpose : A Program for Hello World in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 09:04 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Hello World in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object HelloWorld {
def main(args: Array[String]): Unit = {
println("Hello, world!")
}
}
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
/*****************************
File Name : CSLAB_VARIABLES_V1
Purpose : A Program for Variables in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 9:15 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Variables in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
var vAR_CSLAB_myVariable : Int = 0;
val vAR_CSLAB_myValue : Int = 1;
vAR_CSLAB_myValue
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
/*****************************
File Name : CSLAB_VARIABLE_MULTIPLE_ASSIGNMENTS_V1
Purpose : A Program for Variables in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 9:28 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Variables in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
var vAR_CSLAB_myVariable = 10;
val vAR_CSLAB_myValue = "Hello, Scala!";
val (vAR_CSLAB_myVariable1: Int, vAR_CSLAB_myVariable2: String) = (40, "Foo") // tuple literal; Pair is no longer in the standard library
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
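`Pair` was removed from the Scala standard library; a plain tuple literal is the current way to bind multiple values at once. A minimal sketch (variable names are illustrative, following the lab's naming style):

```scala
object TupleDemo {
  def main(args: Array[String]): Unit = {
    // a tuple literal groups values of different types
    val vAR_CSLAB_pair: (Int, String) = (40, "Foo")
    // destructuring bind, equivalent to the Pair-based form
    val (vAR_CSLAB_n, vAR_CSLAB_s) = vAR_CSLAB_pair
    println(vAR_CSLAB_n) // 40
    println(vAR_CSLAB_s) // Foo
    // elements can also be read positionally
    println(vAR_CSLAB_pair._1) // 40
  }
}
```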
/*****************************
File Name : CSLAB_DATATYPE_INTEGER_V1
Purpose : A Program for Integer Datatype in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 9:37 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Integer Datatype in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
var vAR_CSLAB_a : Int = 12;
vAR_CSLAB_a + 30
val vAR_CSLAB_b : Int = 50;
vAR_CSLAB_b + 30
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
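As a companion sketch (variable names are illustrative): `Int` in Scala is a signed 32-bit value, so arithmetic that may exceed `Int.MaxValue` should widen to `Long` first:

```scala
object IntRangeDemo {
  def main(args: Array[String]): Unit = {
    val vAR_CSLAB_a: Int = 12
    println(vAR_CSLAB_a + 30)               // 42
    println(Int.MaxValue)                   // 2147483647, the largest Int
    // widening to Long before adding avoids 32-bit overflow
    val vAR_CSLAB_big: Long = Int.MaxValue.toLong + 1L
    println(vAR_CSLAB_big)                  // 2147483648
  }
}
```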
/*****************************
File Name : CSLAB_DATATYPE_FLOAT_V1
Purpose : A Program for Float Datatype in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 9:42 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Float Datatype in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
var vAR_CSLAB_c = 12.3f;
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
/*****************************
File Name : CSLAB_DATATYPE_DOUBLE_V1
Purpose : A Program for Double Datatype in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 9:47 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Double Datatype in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
var vAR_CSLAB_c = 12.3;
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
/*****************************
File Name : CSLAB_ACCESS_MODIFIER_PRIVATE_MEMBER_V1
Purpose : A Program for Access Modifier - Private Members in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 9:54 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Access Modifier - Private Members in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
class Example {
private var vAR_CSLAB_a:Int=7
def show(){
vAR_CSLAB_a=8
println(vAR_CSLAB_a)
}
}
object access extends App{
var vAR_CSLAB_e=new Example()
vAR_CSLAB_e.show()
//vAR_CSLAB_e.vAR_CSLAB_a=8          // does not compile: vAR_CSLAB_a is private
//println(vAR_CSLAB_e.vAR_CSLAB_a)
}
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
/*****************************
File Name : CSLAB_ACCESS_MODIFIER_PROTECTED_MEMBER_V1
Purpose : A Program for Access Modifier - Protected Members in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 10:09 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Access Modifier - Protected Members in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
class Example1{
protected var vAR_CSLAB_a:Int=7
def show(){
vAR_CSLAB_a=8
println(vAR_CSLAB_a)
}
}
class Example2 extends Example1{
def show1(){
vAR_CSLAB_a=9
println(vAR_CSLAB_a)
}
}
object access1 extends App{
var vAR_CSLAB_e=new Example1()
vAR_CSLAB_e.show()
var vAR_CSLAB_e2=new Example2()
vAR_CSLAB_e2.show1()
//vAR_CSLAB_e.vAR_CSLAB_a=10        // does not compile: vAR_CSLAB_a is protected
//println(vAR_CSLAB_e.vAR_CSLAB_a)
}
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
/*****************************
File Name : CSLAB_ACCESS_MODIFIER_PUBLIC_MEMBER_V1
Purpose : A Program for Access Modifier - Public Members in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 10:28 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Access Modifier - Public Members in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
class Example3{
var vAR_CSLAB_a:Int=7   // no access modifier: members are public by default in Scala
def show(){
println(vAR_CSLAB_a)
}
}
class Example4 extends Example3 {
def show1(){
vAR_CSLAB_a=9
println(vAR_CSLAB_a)
}
}
object access2 extends App{
var vAR_CSLAB_e=new Example3()
vAR_CSLAB_e.show()
var vAR_CSLAB_e1=new Example4()
vAR_CSLAB_e1.show1()
vAR_CSLAB_e.vAR_CSLAB_a=10         // public members are accessible from outside the class
println(vAR_CSLAB_e.vAR_CSLAB_a)
vAR_CSLAB_e1.show()
}
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
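The three access-modifier programs above can be condensed into one sketch (the `Account` class and its member names are invented for illustration): `private` limits access to the defining class, `protected` adds subclasses, and a member with no modifier is public:

```scala
class Account {
  private var vAR_CSLAB_balance: Int = 0          // visible only inside Account
  protected val vAR_CSLAB_owner: String = "demo"  // visible in Account and subclasses
  var vAR_CSLAB_label: String = "public"          // no modifier: public
  def deposit(vAR_CSLAB_amount: Int): Unit = { vAR_CSLAB_balance += vAR_CSLAB_amount }
  def current: Int = vAR_CSLAB_balance            // public accessor for the private field
}
object AccessSummary extends App {
  val vAR_CSLAB_acct = new Account()
  vAR_CSLAB_acct.deposit(5)
  println(vAR_CSLAB_acct.current)          // 5, read through the public accessor
  println(vAR_CSLAB_acct.vAR_CSLAB_label)  // public member, directly accessible
}
```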
/*****************************
File Name : CSLAB_STRING_INTERPOLATION_V1
Purpose : A Program for String Interpolation in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 10:39 hrs
Version : 1.0
/*****************************
## Program Description : A Program for String Interpolation in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object HelloWorld{
def main(args: Array[String]) {
val vAR_CSLAB_name = "mark"
val vAR_CSLAB_age = 18
println(vAR_CSLAB_name + " is "+ vAR_CSLAB_age + " years old" )
println()
println(s"$vAR_CSLAB_name is $vAR_CSLAB_age years old")
println(f"$vAR_CSLAB_name%s is $vAR_CSLAB_age%d years old")
}
}
HelloWorld.main(Array(" "))
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
/*****************************
File Name : CSLAB_S_STRING_INTERPOLATION_V1
Purpose : A Program for "S" String Interpolation in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 10:44 hrs
Version : 1.0
/*****************************
## Program Description : A Program for "S" String Interpolation in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object HelloWorld{
def main(args: Array[String]) {
val vAR_CSLAB_name = "mark"
val vAR_CSLAB_age = 18
println(vAR_CSLAB_name + " is "+ vAR_CSLAB_age + " years old" )
println()
println(s"$vAR_CSLAB_name is $vAR_CSLAB_age years old")
}
}
HelloWorld.main(Array(" "))
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
/*****************************
File Name : CSLAB_F_STRING_INTERPOLATION_V1
Purpose : A Program for "F" String Interpolation in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 10:49 hrs
Version : 1.0
/*****************************
## Program Description : A Program for "F" String Interpolation in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object HelloWorld{
def main(args: Array[String]) {
val vAR_CSLAB_name = "mark"
val vAR_CSLAB_age = 18
println(vAR_CSLAB_name + " is "+ vAR_CSLAB_age + " years old" )
println()
println(f"$vAR_CSLAB_name%s is $vAR_CSLAB_age%d years old")
}
}
HelloWorld.main(Array(" "))
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
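Scala ships a third built-in interpolator alongside `s` and `f`: `raw`, which leaves escape sequences unprocessed. A minimal sketch in the same style as the programs above:

```scala
object RAW_STRING_INTERPOLATION {
  def main(args: Array[String]): Unit = {
    val vAR_CSLAB_name = "mark"
    println(s"$vAR_CSLAB_name\nsecond line")      // \n is processed into a newline
    println(raw"$vAR_CSLAB_name\nstill one line") // \n is printed literally
  }
}
RAW_STRING_INTERPOLATION.main(Array(" "))
```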
/*****************************
File Name : CSLAB_IF_STATEMENT_V1
Purpose : A Program for if Statements in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 11:04 hrs
Version : 1.0
/*****************************
## Program Description : A Program for if Statements in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object IF_STATEMENT {
def main(args: Array[String]) {
var vAR_CSLAB_x = 10;
if( vAR_CSLAB_x < 20 ){
println("This is if statement");
}
}
}
IF_STATEMENT.main(Array(" "))
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
/*****************************
File Name : CSLAB_IF_ELSE_STATEMENT_V1
Purpose : A Program for if Else Statements in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 11:10 hrs
Version : 1.0
/*****************************
## Program Description : A Program for if Else Statements in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object IF_ELSE_STATEMENT {
def main(args: Array[String]) {
var vAR_CSLAB_x = 30;
if( vAR_CSLAB_x < 20 ){
println("This is if statement");
}else{
println("This is else statement");
}
}
}
IF_ELSE_STATEMENT.main(Array(" "))
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
/*****************************
File Name : CSLAB_ELSE_STATEMENT_V1
Purpose : A Program for Else Statements in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 11:24 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Else Statements in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object ELSE_STATEMENT {
def main(args: Array[String]) {
var vAR_CSLAB_x = 30;
if( vAR_CSLAB_x == 10 ){
println("Value of vAR_CSLAB_X is 10");
}else if( vAR_CSLAB_x == 20 ){
println("Value of vAR_CSLAB_X is 20");
}else if( vAR_CSLAB_x == 30 ){
println("Value of vAR_CSLAB_X is 30");
}else{
println("This is else statement");
}
}
}
ELSE_STATEMENT.main(Array(" "))
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
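In idiomatic Scala, an else-if ladder over a single value is usually written as a `match` expression instead. This sketch reproduces the program above:

```scala
object MATCH_STATEMENT {
  def main(args: Array[String]): Unit = {
    var vAR_CSLAB_x = 30
    // match is an expression: it evaluates to the branch result
    val vAR_CSLAB_msg = vAR_CSLAB_x match {
      case 10 => "Value of vAR_CSLAB_X is 10"
      case 20 => "Value of vAR_CSLAB_X is 20"
      case 30 => "Value of vAR_CSLAB_X is 30"
      case _  => "This is else statement"   // wildcard plays the role of else
    }
    println(vAR_CSLAB_msg)
  }
}
MATCH_STATEMENT.main(Array(" "))
```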
/*****************************
File Name : CSLAB_NESTED_IF_STATEMENT_V1
Purpose : A Program for Nested if Statements in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 11:33 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Nested if Statements in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object NESTED_IF_STATEMENT {
def main(args: Array[String]) {
var vAR_CSLAB_x = 30;
var vAR_CSLAB_y = 10;
if( vAR_CSLAB_x == 30 ){
if( vAR_CSLAB_y == 10 ){
println("vAR_CSLAB_X = 30 and vAR_CSLAB_Y = 10");
}
}
}
}
NESTED_IF_STATEMENT.main(Array(" "))
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
/*****************************
File Name : CSLAB_BREAK_STATEMENT_V1
Purpose : A Program for Break Statement in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 11:45 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Break Statement in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import scala.util.control._
object BREAK_STATEMENT {
def main(args: Array[String]) {
var a = 0;
val numList = List(1,2,3,4,5,6,7,8,9,10);
val loop = new Breaks;
loop.breakable {
for( a <- numList){
println( "Value of a: " + a );
if( a == 4 ){
loop.break;
}
}
}
println( "After the loop" );
}
}
BREAK_STATEMENT.main(Array(" "))
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
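Because Scala has no built-in `break` keyword, collection combinators such as `takeWhile` often replace `scala.util.control.Breaks`. This sketch prints the same values as the program above, stopping after 4:

```scala
object TAKEWHILE_DEMO {
  def main(args: Array[String]): Unit = {
    val vAR_CSLAB_numList = List(1,2,3,4,5,6,7,8,9,10)
    // keep elements only while the predicate holds, then stop
    vAR_CSLAB_numList.takeWhile(_ <= 4).foreach { vAR_CSLAB_a =>
      println("Value of a: " + vAR_CSLAB_a)
    }
    println("After the loop")
  }
}
TAKEWHILE_DEMO.main(Array(" "))
```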
/*****************************
File Name : CSLAB_BREAKING_NESTED_LOOPS_V1
Purpose : A Program for Breaking Nested Loops in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 12:01 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Breaking Nested Loops in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import scala.util.control._
object BREAKING_NESTED_LOOP {
  def main(args: Array[String]): Unit = {
    val vAR_CSLAB_numList1 = List(1, 2, 3, 4, 5)
    val vAR_CSLAB_numList2 = List(11, 12, 13)
    val vAR_CSLAB_outer = new Breaks
    val vAR_CSLAB_inner = new Breaks
    vAR_CSLAB_outer.breakable {
      for (vAR_CSLAB_a <- vAR_CSLAB_numList1) {
        println("Value of vAR_CSLAB_a: " + vAR_CSLAB_a)
        vAR_CSLAB_inner.breakable {
          for (vAR_CSLAB_b <- vAR_CSLAB_numList2) {
            println("Value of vAR_CSLAB_b: " + vAR_CSLAB_b)
            if (vAR_CSLAB_b == 12) {
              vAR_CSLAB_inner.break
            }
          }
        } // inner breakable
      }
    } // outer breakable
  }
}
BREAKING_NESTED_LOOP.main(Array(" "))
/*****************************
File Name : CSLAB_WHILE_LOOP_V1
Purpose : A Program for While Loop in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 12:14 hrs
Version : 1.0
/*****************************
## Program Description : A Program for While Loop in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object WHILE_LOOP {
  def main(args: Array[String]): Unit = {
    // Local variable declaration:
    var vAR_CSLAB_a = 10
    // while loop execution
    while (vAR_CSLAB_a < 20) {
      println("Value of vAR_CSLAB_a: " + vAR_CSLAB_a)
      vAR_CSLAB_a = vAR_CSLAB_a + 1
    }
  }
}
WHILE_LOOP.main(Array(" "))
/*****************************
File Name : CSLAB_DO_WHILE_LOOP_V1
Purpose : A Program for Do While Loop in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 12:28 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Do While Loop in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object DO_WHILE_LOOP {
  def main(args: Array[String]): Unit = {
    // Local variable declaration:
    var vAR_CSLAB_a = 10
    // do loop: the body runs once before the condition is tested
    do {
      println("Value of vAR_CSLAB_a: " + vAR_CSLAB_a)
      vAR_CSLAB_a = vAR_CSLAB_a + 1
    } while (vAR_CSLAB_a < 20)
  }
}
DO_WHILE_LOOP.main(Array(" "))
/*****************************
File Name : CSLAB_FOR_LOOP_V1
Purpose : A Program for a For Loop in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 12:41 hrs
Version : 1.0
/*****************************
## Program Description : A Program for a For Loop in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object FOR_LOOP {
  def main(args: Array[String]): Unit = {
    // for loop execution with a range; the loop binds its own vAR_CSLAB_a
    for (vAR_CSLAB_a <- 1 to 10) {
      println("Value of vAR_CSLAB_a: " + vAR_CSLAB_a)
    }
  }
}
FOR_LOOP.main(Array(" "))
/*****************************
File Name : CSLAB_ADD_INTEGERS_V1
Purpose : A Program for Adding Integers in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 12:50 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Adding Integers in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object add {
  def FUNCTION_ADD_INTEGERS(vAR_CSLAB_a: Int, vAR_CSLAB_b: Int): Int = {
    val vAR_CSLAB_sum = vAR_CSLAB_a + vAR_CSLAB_b
    vAR_CSLAB_sum
  }
}
add.FUNCTION_ADD_INTEGERS(5, 6)
/*****************************
File Name : CSLAB_CALLING_FUNCTIONS_V1
Purpose : A Program for Calling Functions in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 13:42 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Calling Functions in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object add {
  def FUNCTION_ADD_INTEGERS(vAR_CSLAB_a: Int, vAR_CSLAB_b: Int): Int = {
    val vAR_CSLAB_sum = vAR_CSLAB_a + vAR_CSLAB_b
    vAR_CSLAB_sum
  }
}
add.FUNCTION_ADD_INTEGERS(5, 6)
/*****************************
File Name : CSLAB_FUNCTIONS_CALL_BY_NAME_V1
Purpose : A Program for Functions Call by Name in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 14:01 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Functions Call by Name in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
  def FUNCTION_CALL_BY_NAME(args: Array[String]): Unit = {
    delayed(time())
  }
  def time() = {
    println("Getting time in nanoseconds")
    System.nanoTime
  }
  // t: => Long is a by-name parameter: time() is evaluated each time t is used
  def delayed(t: => Long) = {
    println("In delayed method")
    println("Param: " + t)
  }
}
Demo.FUNCTION_CALL_BY_NAME(Array(" "))
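To make the by-name semantics concrete, this supplementary sketch (object and method names are illustrative, not part of the original lab) counts how often an argument expression is evaluated under call-by-value versus call-by-name:

```scala
object CALL_BY_VALUE_VS_NAME {
  var evalCount = 0
  def tick(): Int = { evalCount += 1; evalCount }

  // t: Int is call-by-value: tick() runs once, before the body
  def byValue(t: Int): Int = t + t
  // t: => Int is call-by-name: tick() runs at every use of t
  def byName(t: => Int): Int = t + t
}
CALL_BY_VALUE_VS_NAME.byValue(CALL_BY_VALUE_VS_NAME.tick())
val countAfterByValue = CALL_BY_VALUE_VS_NAME.evalCount // evaluated once
CALL_BY_VALUE_VS_NAME.byName(CALL_BY_VALUE_VS_NAME.tick())
val countAfterByName = CALL_BY_VALUE_VS_NAME.evalCount // evaluated twice more
```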
/*****************************
File Name : CSLAB_FUNCTIONS_WITH_VARIABLE_ARGUMENTS_V1
Purpose : A Program for Functions with Variable Arguments in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 14:17 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Functions with Variable Arguments in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
  def FUNCTIONS_VARIABLE_ARGUMENTS(args: Array[String]): Unit = {
    printStrings("Hello", "Scala", "Python")
  }
  // args: String* accepts any number of String arguments
  def printStrings(args: String*) = {
    var vAR_CSLAB_i: Int = 0
    for (arg <- args) {
      println("Arg value[" + vAR_CSLAB_i + "] = " + arg)
      vAR_CSLAB_i = vAR_CSLAB_i + 1
    }
  }
}
Demo.FUNCTIONS_VARIABLE_ARGUMENTS(Array(" "))
/*****************************
File Name : CSLAB_DEFAULT_PARAMETER_VALUE_FOR_A_FUNCTION_V1
Purpose : A Program for Default Parameter Value for a Function in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 14:32 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Default Parameter Value for a Function in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
  def main(args: Array[String]): Unit = {
    println("Returned Value : " + FUNCTION_ADD_INTEGERS())
  }
  // Both parameters have default values, so the call above passes none
  def FUNCTION_ADD_INTEGERS(vAR_CSLAB_a: Int = 5, vAR_CSLAB_b: Int = 7): Int = {
    val vAR_CSLAB_sum = vAR_CSLAB_a + vAR_CSLAB_b
    vAR_CSLAB_sum
  }
}
Demo.main(Array(" "))
/*****************************
File Name : CSLAB_NESTED_FUNCTIONS_V1
Purpose : A Program for Nested Functions in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 14:48 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Nested Functions in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
  def main(args: Array[String]): Unit = {
    println(FUNCTION_FACTORIAL(0))
    println(FUNCTION_FACTORIAL(1))
    println(FUNCTION_FACTORIAL(2))
    println(FUNCTION_FACTORIAL(3))
  }
  def FUNCTION_FACTORIAL(vAR_CSLAB_i: Int): Int = {
    // fact is nested: it is visible only inside FUNCTION_FACTORIAL
    def fact(vAR_CSLAB_i: Int, vAR_CSLAB_accumulator: Int): Int = {
      if (vAR_CSLAB_i <= 1)
        vAR_CSLAB_accumulator
      else
        fact(vAR_CSLAB_i - 1, vAR_CSLAB_i * vAR_CSLAB_accumulator)
    }
    fact(vAR_CSLAB_i, 1)
  }
}
Demo.main(Array(" "))
/*****************************
File Name : CSLAB_PARTIALLY_APPLIED_FUNCTIONS_V1
Purpose : A Program for Partially Applied Functions in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 15:03 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Partially Applied Functions in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import java.util.Date
object Demo {
  def PARTIAL_FUNCTIONS(args: Array[String]): Unit = {
    val vAR_CSLAB_date = new Date
    log(vAR_CSLAB_date, "message1")
    Thread.sleep(1000)
    log(vAR_CSLAB_date, "message2")
    Thread.sleep(1000)
    log(vAR_CSLAB_date, "message3")
  }
  def log(vAR_CSLAB_date: Date, vAR_CSLAB_message: String) = {
    println(vAR_CSLAB_date + "----" + vAR_CSLAB_message)
  }
}
Demo.PARTIAL_FUNCTIONS(Array(" "))
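The program above defines log with two parameters but always supplies both. A partially applied function binds only some of them; this supplementary sketch (PartialDemo and logWithDateBound are illustrative names, not part of the original lab) fixes the date and leaves the message open:

```scala
import java.util.Date

object PartialDemo {
  def log(date: Date, message: String): String = s"$date----$message"
}

val fixedDate = new Date
// Supplying only the first argument yields a function String => String
val logWithDateBound = PartialDemo.log(fixedDate, _: String)
println(logWithDateBound("message1"))
println(logWithDateBound("message2"))
```

Every call through logWithDateBound reuses the same bound date, which is the usual motivation for partial application.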
/*****************************
File Name : CSLAB_RECURSION_FUNCTION_V1
Purpose : A Program for Recursion Function in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 15:39 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Recursion Function in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
  def RECURSION_FUNCTION(args: Array[String]): Unit = {
    for (vAR_CSLAB_i <- 1 to 10)
      println("Factorial of " + vAR_CSLAB_i + ": = " + factorial(vAR_CSLAB_i))
  }
  def factorial(vAR_CSLAB_n: BigInt): BigInt = {
    if (vAR_CSLAB_n <= 1)
      1
    else
      vAR_CSLAB_n * factorial(vAR_CSLAB_n - 1)
  }
}
Demo.RECURSION_FUNCTION(Array(" "))
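The recursive factorial above grows the call stack with every step. As a supplementary sketch (TailFactorial is an illustrative name), the accumulator pattern from the nested-functions program makes the recursion tail-recursive, and @tailrec asks the compiler to verify it compiles to a loop:

```scala
import scala.annotation.tailrec

object TailFactorial {
  def factorial(n: BigInt): BigInt = {
    // The recursive call is the last action, so the compiler rewrites it as a loop
    @tailrec
    def loop(n: BigInt, acc: BigInt): BigInt =
      if (n <= 1) acc else loop(n - 1, n * acc)
    loop(n, 1)
  }
}
println(TailFactorial.factorial(10))
```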
/*****************************
File Name : CSLAB_HIGHER_ORDER_FUNCTIONS_V1
Purpose : A Program for Higher Order Functions in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 15:53 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Higher Order Functions in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
  def HIGHER_ORDER_FUNCTION(args: Array[String]): Unit = {
    println(apply(layout, 10))
  }
  // apply is higher-order: its first parameter f is itself a function
  def apply(f: Int => String, v: Int) = f(v)
  def layout[A](x: A) = "[" + x.toString() + "]"
}
Demo.HIGHER_ORDER_FUNCTION(Array(" "))
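Higher-order functions are pervasive in the standard library: map and filter each take a function as an argument. A brief supplementary sketch (not part of the original lab):

```scala
val nums = List(1, 2, 3, 4, 5)
// map applies a function to every element; filter keeps matching elements
val squares = nums.map(n => n * n)
val evens = nums.filter(_ % 2 == 0)
println(squares)
println(evens)
```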
/*****************************
File Name : CSLAB_ANONYMOUS_FUNCTIONS_V1
Purpose : A Program for Anonymous Functions in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 16:07 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Anonymous Functions in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
var inc = (vAR_CSLAB_x: Int) => vAR_CSLAB_x + 1
var vAR_CSLAB_x = inc(7)
var mul = (vAR_CSLAB_x: Int, vAR_CSLAB_y: Int) => vAR_CSLAB_x * vAR_CSLAB_y
println(mul(3, 4))
var userDir = () => { System.getProperty("user.dir") }
// Call the function with (); printing userDir itself would only show a function value
println(userDir())
/*****************************
File Name : CSLAB_CURRYING_FUNCTIONS_V1
Purpose : A Program for Currying Functions in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 16:23 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Currying Functions in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
  def CURRYING_FUNCTIONS(args: Array[String]): Unit = {
    val vAR_CSLAB_str1: String = "Hello, "
    val vAR_CSLAB_str2: String = "Scala!"
    println("vAR_CSLAB_str1 + vAR_CSLAB_str2 = " + strcat(vAR_CSLAB_str1)(vAR_CSLAB_str2))
  }
  // strcat takes two parameter lists: a curried function
  def strcat(vAR_CSLAB_s1: String)(vAR_CSLAB_s2: String) = {
    vAR_CSLAB_s1 + vAR_CSLAB_s2
  }
}
Demo.CURRYING_FUNCTIONS(Array(" "))
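A curried function can also be applied one parameter list at a time. This supplementary sketch (CurryDemo and helloPrefix are illustrative names) reuses the strcat pattern above:

```scala
object CurryDemo {
  def strcat(s1: String)(s2: String): String = s1 + s2
}

// Supplying only the first parameter list yields a function String => String
val helloPrefix: String => String = CurryDemo.strcat("Hello, ")
println(helloPrefix("Scala!"))
println(helloPrefix("world!"))
```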
/*****************************
File Name : CSLAB_CLOSURE_V1
Purpose : A Program for Closure in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 16:38 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Closure in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
  def CLOSURE(args: Array[String]): Unit = {
    println("vAR_CSLAB_multiplier(1) value = " + vAR_CSLAB_multiplier(1))
    println("vAR_CSLAB_multiplier(2) value = " + vAR_CSLAB_multiplier(2))
  }
  var vAR_CSLAB_factor = 3
  // the anonymous function closes over vAR_CSLAB_factor, defined outside it
  val vAR_CSLAB_multiplier = (vAR_CSLAB_i: Int) => vAR_CSLAB_i * vAR_CSLAB_factor
}
Demo.CLOSURE(Array(" "))
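A closure captures the variable itself, not a snapshot of its value. This supplementary sketch (ClosureDemo is an illustrative name) shows the multiplier's result changing after the captured factor is reassigned:

```scala
object ClosureDemo {
  var factor = 3
  // The lambda closes over the variable factor, not over the value 3
  val multiplier = (i: Int) => i * factor
}

val before = ClosureDemo.multiplier(2) // factor is 3 here
ClosureDemo.factor = 10
val after = ClosureDemo.multiplier(2) // the closure sees the new value
```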
/*****************************
File Name : CSLAB_CREATING_A_STRING_V1
Purpose : A Program for Creation of Strings in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 16:54 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Creation of Strings in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
  val vAR_CSLAB_greeting: String = "Hello, world!"
  def main(args: Array[String]): Unit = {
    println(vAR_CSLAB_greeting)
  }
}
Demo.main(Array(" "))
/*****************************
File Name : CSLAB_CONCATENATING_STRINGS_V1
Purpose : A Program for Concatenating Strings in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 17:08 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Concatenating Strings in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
  def CONCATENATING_STRINGS(args: Array[String]): Unit = {
    var vAR_CSLAB_str1 = "Dot saw I was "
    var vAR_CSLAB_str2 = "Tod"
    println("Dot " + vAR_CSLAB_str1 + vAR_CSLAB_str2)
  }
}
Demo.CONCATENATING_STRINGS(Array(" "))
/*****************************
File Name : CSLAB_CREATING_FORMAT_STRINGS_V1
Purpose : A Program for Creating Format Strings in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 17:24 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Creating Format Strings in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
  def CREATING_FORMAT_STRINGS(args: Array[String]): Unit = {
    var vAR_CSLAB_floatVar = 12.456
    var vAR_CSLAB_intVar = 2000
    var vAR_CSLAB_stringVar = "Hello, Scala!"
    // format returns the built string; printf would print it and return Unit
    var vAR_CSLAB_fs = ("The value of the float variable is %f, " +
      "while the value of the integer variable is %d, " +
      "and the string is %s").format(vAR_CSLAB_floatVar, vAR_CSLAB_intVar, vAR_CSLAB_stringVar)
    println(vAR_CSLAB_fs)
  }
}
Demo.CREATING_FORMAT_STRINGS(Array(" "))
/*****************************
File Name : CSLAB_DECLARING_ARRAY_VARIABLES_V1
Purpose : A Program for Declaring Array Variables in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 17:43 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Declaring Array Variables in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
  def DECLARING_ARRAY_VARIABLES(args: Array[String]): Unit = {
    var vAR_CSLAB_z = Array("Zara", "Nuha", "Ayan")
    println(vAR_CSLAB_z.mkString(", "))
  }
}
Demo.DECLARING_ARRAY_VARIABLES(Array(" "))
/*****************************
File Name : CSLAB_PROCESSING_ARRAYS_V1
Purpose : A Program for Processing Arrays in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 30/01/2019 18:07 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Processing Arrays in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
  def PROCESSING_ARRAYS(args: Array[String]): Unit = {
    var vAR_CSLAB_myList = Array(1.9, 2.9, 3.4, 3.5)
    // Print all the array elements
    for (vAR_CSLAB_x <- vAR_CSLAB_myList) {
      println(vAR_CSLAB_x)
    }
    // Summing all elements
    var vAR_CSLAB_total = 0.0
    for (vAR_CSLAB_i <- 0 to (vAR_CSLAB_myList.length - 1)) {
      vAR_CSLAB_total += vAR_CSLAB_myList(vAR_CSLAB_i)
    }
    println("Total is " + vAR_CSLAB_total)
    // Finding the largest element
    var vAR_CSLAB_max = vAR_CSLAB_myList(0)
    for (vAR_CSLAB_i <- 1 to (vAR_CSLAB_myList.length - 1)) {
      if (vAR_CSLAB_myList(vAR_CSLAB_i) > vAR_CSLAB_max) vAR_CSLAB_max = vAR_CSLAB_myList(vAR_CSLAB_i)
    }
    println("Max is " + vAR_CSLAB_max)
  }
}
Demo.PROCESSING_ARRAYS(Array(" "))
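The summing and maximum loops above are common enough that the standard library provides them directly; a supplementary sketch (not part of the original lab):

```scala
val myList = Array(1.9, 2.9, 3.4, 3.5)
// sum and max replace the explicit index loops
val total = myList.sum
val maxValue = myList.max
println("Total is " + total)
println("Max is " + maxValue)
```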
/*****************************
File Name : CSLAB_MULTI_DIMENSIONAL_ARRAYS_V1
Purpose : A Program for Multi-Dimensional Arrays in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 31/01/2019 9:47 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Multi-Dimensional Arrays in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import Array._
object Demo {
  def MULTI_DIMENSIONAL_ARRAYS(args: Array[String]): Unit = {
    var vAR_CSLAB_myMatrix = ofDim[Int](3, 3)
    // build a matrix
    for (vAR_CSLAB_i <- 0 to 2) {
      for (vAR_CSLAB_j <- 0 to 2) {
        vAR_CSLAB_myMatrix(vAR_CSLAB_i)(vAR_CSLAB_j) = vAR_CSLAB_j
      }
    }
    // Print the two-dimensional array
    for (vAR_CSLAB_i <- 0 to 2) {
      for (vAR_CSLAB_j <- 0 to 2) {
        print(" " + vAR_CSLAB_myMatrix(vAR_CSLAB_i)(vAR_CSLAB_j))
      }
      println()
    }
  }
}
Demo.MULTI_DIMENSIONAL_ARRAYS(Array(" "))
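The same matrix can be built without mutation using Array.tabulate, which takes the dimensions and a function from indices to cell values; a supplementary sketch (not part of the original lab):

```scala
// Each cell (i, j) is initialised to j, matching the loop version above
val myMatrix = Array.tabulate(3, 3)((_, j) => j)
myMatrix.foreach(row => println(row.mkString(" ")))
```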
/*****************************
File Name : CSLAB_CONCATENATING_ARRAYS_V1
Purpose : A Program for Concatenating Arrays in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 31/01/2019 10:12 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Concatenating Arrays in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import Array._
object Demo {
  def CONCATENATING_ARRAYS(args: Array[String]): Unit = {
    var vAR_CSLAB_myList1 = Array(1.9, 2.9, 3.4, 3.5)
    var vAR_CSLAB_myList2 = Array(8.9, 7.9, 0.4, 1.5)
    var vAR_CSLAB_myList3 = concat(vAR_CSLAB_myList1, vAR_CSLAB_myList2)
    // Print all the array elements
    for (vAR_CSLAB_x <- vAR_CSLAB_myList3) {
      println(vAR_CSLAB_x)
    }
  }
}
Demo.CONCATENATING_ARRAYS(Array(" "))
/*****************************
File Name : CSLAB_CREATING_ARRAYS_WITH_RANGE_V1
Purpose : A Program for Creating Arrays With Range in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 31/01/2019 10:25 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Creating Arrays With Range in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import Array._
object Demo {
  def ARRAYS_WITH_RANGE(args: Array[String]): Unit = {
    var vAR_CSLAB_myList1 = range(10, 20, 2) // 10 to 18 in steps of 2
    var vAR_CSLAB_myList2 = range(10, 20)    // 10 to 19 in steps of 1
    // Print all the array elements
    for (vAR_CSLAB_x <- vAR_CSLAB_myList1) {
      print(" " + vAR_CSLAB_x)
    }
    println()
    for (vAR_CSLAB_x <- vAR_CSLAB_myList2) {
      print(" " + vAR_CSLAB_x)
    }
  }
}
Demo.ARRAYS_WITH_RANGE(Array(" "))
/*****************************
File Name : CSLAB_COLLECTIONS_V1
Purpose : A Program for Collections in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 31/01/2019 10:42 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Collections in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
// Define List of integers.
val vAR_CSLAB_x = List(1,2,3,4)
// Define a set.
var vAR_CSLAB_x1 = Set(1,3,5,7)
// Define a map.
val vAR_CSLAB_x2 = Map("one" -> 1, "two" -> 2, "three" -> 3)
// Create a tuple of two elements.
val vAR_CSLAB_x3 = (10, "Scala")
// Define an option
val vAR_CSLAB_x4:Option[Int] = Some(5)
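Maps and options are typically read back with get and map, which never throw on a missing key; a supplementary sketch (the collections are redeclared with short names here so the block is self-contained):

```scala
val x2 = Map("one" -> 1, "two" -> 2, "three" -> 3)
val x4: Option[Int] = Some(5)
// get returns an Option: Some(value) if the key is present, None otherwise
val two = x2.get("two")
val missing = x2.get("four")
// map transforms the value inside an Option without unwrapping it
val doubled = x4.map(_ * 2)
```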
/*****************************
File Name : CSLAB_LISTS_V1
Purpose : A Program for Lists in Collections in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 31/01/2019 11:02 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Lists in Collections in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
  def LISTS(args: Array[String]): Unit = {
    val vAR_CSLAB_fruit = "apples" :: ("oranges" :: ("pears" :: Nil))
    val vAR_CSLAB_nums = Nil
    println("Head of fruit : " + vAR_CSLAB_fruit.head)
    println("Tail of fruit : " + vAR_CSLAB_fruit.tail)
    println("Check if fruit is empty : " + vAR_CSLAB_fruit.isEmpty)
    println("Check if nums is empty : " + vAR_CSLAB_nums.isEmpty)
  }
}
Demo.LISTS(Array(" "))
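Beyond head and tail, List offers many ready-made operations; a supplementary sketch (not part of the original lab):

```scala
val fruit = "apples" :: "oranges" :: "pears" :: Nil
val reversed = fruit.reverse      // elements in the opposite order
val lengths = fruit.map(_.length) // length of each string
val hasPears = fruit.contains("pears")
println(reversed)
println(lengths)
```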
/*****************************
File Name : CSLAB_COLLECTIONS_CONCATENATING_LISTS_V1
Purpose : A Program for Concatenating Lists in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 31/01/2019 11:18 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Concatenating Lists in Collections in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
  def CONCATENATING_LISTS(args: Array[String]): Unit = {
    val vAR_CSLAB_fruit1 = "apples" :: ("oranges" :: ("pears" :: Nil))
    val vAR_CSLAB_fruit2 = "mangoes" :: ("banana" :: Nil)
    // use two or more lists with the ::: operator
    var vAR_CSLAB_fruit = vAR_CSLAB_fruit1 ::: vAR_CSLAB_fruit2
    println("vAR_CSLAB_fruit1 ::: vAR_CSLAB_fruit2 : " + vAR_CSLAB_fruit)
    // use two lists with the List.:::() method; note it prepends the receiver,
    // so fruit1.:::(fruit2) is the same as fruit2 ::: fruit1
    vAR_CSLAB_fruit = vAR_CSLAB_fruit1.:::(vAR_CSLAB_fruit2)
    println("vAR_CSLAB_fruit1.:::(vAR_CSLAB_fruit2) : " + vAR_CSLAB_fruit)
    // pass two or more lists as arguments
    vAR_CSLAB_fruit = List.concat(vAR_CSLAB_fruit1, vAR_CSLAB_fruit2)
    println("List.concat(vAR_CSLAB_fruit1, vAR_CSLAB_fruit2) : " + vAR_CSLAB_fruit)
  }
}
Demo.CONCATENATING_LISTS(Array(" "))
/*****************************
File Name : CSLAB_COLLECTIONS_UNIFORM_LISTS_V1
Purpose : A Program for Uniform Lists in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 31/01/2019 11:34 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Uniform Lists in Collections in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
def UNIFORM_LISTS (args: Array[String]) {
val vAR_CSLAB_fruit = List.fill(3)("apples") // Repeats apples three times.
println( "vAR_CSLAB_fruit : " + vAR_CSLAB_fruit )
val vAR_CSLAB_num = List.fill(10)(2) // Repeats 2, 10 times.
println( "vAR_CSLAB_num : " + vAR_CSLAB_num )
}
}
Demo.UNIFORM_LISTS(Array(" "))
/*****************************
File Name : CSLAB_SETS_V1
Purpose : A Program for Sets in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 31/01/2019 11:51 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Sets in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
def SETS (args: Array[String]) {
val vAR_CSLAB_fruit = Set("apples", "oranges", "pears")
val vAR_CSLAB_nums: Set[Int] = Set()
// note: a default (hash-based) Set has no guaranteed element order, so head and tail are arbitrary
println( "Head of fruit : " + vAR_CSLAB_fruit.head )
println( "Tail of fruit : " + vAR_CSLAB_fruit.tail )
println( "Check if fruit is empty : " + vAR_CSLAB_fruit.isEmpty )
println( "Check if nums is empty : " + vAR_CSLAB_nums.isEmpty )
}
}
Demo.SETS(Array(" "))
/*****************************
File Name : CSLAB_CONCATENATING_SETS_V1
Purpose : A Program for Concatenating Sets in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 31/01/2019 12:07 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Concatenating Sets in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
def CONCATENATING_SETS (args: Array[String]) {
val vAR_CSLAB_fruit1 = Set("apples", "oranges", "pears")
val vAR_CSLAB_fruit2 = Set("mangoes", "banana")
// use two or more sets with ++ as operator
var vAR_CSLAB_fruit = vAR_CSLAB_fruit1 ++ vAR_CSLAB_fruit2
println( "vAR_CSLAB_fruit1 ++ vAR_CSLAB_fruit2 : " + vAR_CSLAB_fruit )
// use two sets with ++ as method
vAR_CSLAB_fruit = vAR_CSLAB_fruit1.++(vAR_CSLAB_fruit2)
println( "vAR_CSLAB_fruit1.++(vAR_CSLAB_fruit2) : " + vAR_CSLAB_fruit )
}
}
Demo.CONCATENATING_SETS(Array(" "))
/*****************************
File Name : CSLAB_ELEMENTS_IN_SETS_V1
Purpose : A Program for Elements in Sets in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 31/01/2019 12:23 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Elements in Sets in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
def ELEMENTS_IN_SETS (args: Array[String]) {
val vAR_CSLAB_num = Set(5,6,9,20,30,45)
// find min and max of the elements
println( "Min element in Set(5,6,9,20,30,45) : " + vAR_CSLAB_num.min )
println( "Max element in Set(5,6,9,20,30,45) : " + vAR_CSLAB_num.max )
}
}
Demo.ELEMENTS_IN_SETS(Array(" "))
/*****************************
File Name : CSLAB_COMMON_VALUES_IN_SETS_V1
Purpose : A Program for Common Values in Sets in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 31/01/2019 12:41 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Common Values in Sets in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
def COMMON_VALUES_IN_SETS (args: Array[String]) {
val vAR_CSLAB_num1 = Set(5,6,9,20,30,45)
val vAR_CSLAB_num2 = Set(50,60,9,20,35,55)
// find common elements between two sets
println( "vAR_CSLAB_num1.&(vAR_CSLAB_num2) : " + vAR_CSLAB_num1.&(vAR_CSLAB_num2) )
println( "vAR_CSLAB_num1.intersect(vAR_CSLAB_num2) : " + vAR_CSLAB_num1.intersect(vAR_CSLAB_num2) )
}
}
Demo.COMMON_VALUES_IN_SETS(Array(" "))
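Alongside intersection, Scala sets support union and difference in the same operator style. A minimal sketch, reusing the sample values above:

```scala
val vAR_CSLAB_num1 = Set(5, 6, 9, 20, 30, 45)
val vAR_CSLAB_num2 = Set(50, 60, 9, 20, 35, 55)

// | is union, & is intersection, &~ is difference
val vAR_CSLAB_union       = vAR_CSLAB_num1 | vAR_CSLAB_num2
val vAR_CSLAB_common      = vAR_CSLAB_num1 & vAR_CSLAB_num2
val vAR_CSLAB_onlyInFirst = vAR_CSLAB_num1 &~ vAR_CSLAB_num2

println("Union : " + vAR_CSLAB_union)
println("Intersection : " + vAR_CSLAB_common)
println("Difference : " + vAR_CSLAB_onlyInFirst)
```

The named methods `union`, `intersect`, and `diff` are equivalent and often read more clearly in longer expressions.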
/*****************************
File Name : CSLAB_MAPS_V1
Purpose : A Program for Maps in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 31/01/2019 12:57 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Maps in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
def MAPS (args: Array[String]) {
val vAR_CSLAB_colors = Map("red" -> "#FF0000", "azure" -> "#F0FFFF", "peru" -> "#CD853F")
val vAR_CSLAB_nums: Map[Int, Int] = Map()
println( "Keys in colors : " + vAR_CSLAB_colors.keys )
println( "Values in colors : " + vAR_CSLAB_colors.values )
println( "Check if colors is empty : " + vAR_CSLAB_colors.isEmpty )
println( "Check if nums is empty : " + vAR_CSLAB_nums.isEmpty )
}
}
Demo.MAPS(Array(" "))
/*****************************
File Name : CSLAB_CONCATENATING_MAPS_V1
Purpose : A Program for Concatenating Maps in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 31/01/2019 13:42 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Concatenating Maps in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
def CONCATENATING_MAPS (args: Array[String]) {
val vAR_CSLAB_colors1 = Map("red" -> "#FF0000", "azure" -> "#F0FFFF", "peru" -> "#CD853F")
val vAR_CSLAB_colors2 = Map("blue" -> "#0033FF", "yellow" -> "#FFFF00", "red" -> "#FF0000")
// use two or more maps with ++ as operator; on duplicate keys the right-hand map wins
var vAR_CSLAB_colors = vAR_CSLAB_colors1 ++ vAR_CSLAB_colors2
println( "vAR_CSLAB_colors1 ++ vAR_CSLAB_colors2 : " + vAR_CSLAB_colors )
// use two maps with ++ as method
vAR_CSLAB_colors = vAR_CSLAB_colors1.++(vAR_CSLAB_colors2)
println( "vAR_CSLAB_colors1.++(vAR_CSLAB_colors2) : " + vAR_CSLAB_colors )
}
}
Demo.CONCATENATING_MAPS(Array(" "))
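One behavior worth calling out when concatenating maps: if both maps contain the same key, `++` keeps the value from the right-hand operand. A small sketch with hypothetical color values:

```scala
val vAR_CSLAB_m1 = Map("red" -> "#FF0000", "azure" -> "#F0FFFF")
val vAR_CSLAB_m2 = Map("red" -> "#EE0000", "blue" -> "#0033FF")

// on a duplicate key, the value from the right-hand operand of ++ wins
val vAR_CSLAB_merged = vAR_CSLAB_m1 ++ vAR_CSLAB_m2

println(vAR_CSLAB_merged("red")) // the value taken from vAR_CSLAB_m2
println(vAR_CSLAB_merged.size)   // 3 entries: "red" appears only once
```

Swapping the operands (`vAR_CSLAB_m2 ++ vAR_CSLAB_m1`) would keep the other "red" value instead.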
/*****************************
File Name : CSLAB_PRINT_KEYS_VALUES_FROM_MAPS_V1
Purpose : A Program for Printing Keys & Values from Maps in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 31/01/2019 14:02 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Printing Keys & Values from Maps in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
def KEYS_VALUES_FROM_MAPS(args: Array[String]) {
val vAR_CSLAB_colors = Map("red" -> "#FF0000", "azure" -> "#F0FFFF","peru" -> "#CD853F")
vAR_CSLAB_colors.keys.foreach{ vAR_CSLAB_i =>
print( "Key = " + vAR_CSLAB_i )
println(" Value = " + vAR_CSLAB_colors(vAR_CSLAB_i) )}
}
}
Demo.KEYS_VALUES_FROM_MAPS (Array(" "))
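Each map entry is a (key, value) tuple, so a case pattern can unpack both at once instead of looking the value up again by key. A minimal sketch (variable names are our own):

```scala
val vAR_CSLAB_colors = Map("red" -> "#FF0000", "azure" -> "#F0FFFF")

// the case pattern binds key and value directly, avoiding a second
// lookup with vAR_CSLAB_colors(key) on every iteration
vAR_CSLAB_colors.foreach { case (vAR_CSLAB_k, vAR_CSLAB_v) =>
  println("Key = " + vAR_CSLAB_k + " Value = " + vAR_CSLAB_v)
}

// the same pattern works with map, e.g. to render entries as strings
val vAR_CSLAB_rendered = vAR_CSLAB_colors.map { case (vAR_CSLAB_k, vAR_CSLAB_v) =>
  vAR_CSLAB_k + " -> " + vAR_CSLAB_v
}
```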
/*****************************
File Name : CSLAB_CHECK_FOR_KEYS_IN_MAP_V1
Purpose : A Program for Checking Keys in Maps in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 31/01/2019 14:19 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Checking Keys in Maps in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
def CHECK_KEYS_FOR_A_MAP (args: Array[String]) {
val vAR_CSLAB_colors = Map("red" -> "#FF0000", "azure" -> "#F0FFFF", "peru" -> "#CD853F")
if( vAR_CSLAB_colors.contains( "red" )) {
println("Red key exists with value : " + vAR_CSLAB_colors("red"))
} else {
println("Red key does not exist")
}
if( vAR_CSLAB_colors.contains( "maroon" )) {
println("Maroon key exists with value : " + vAR_CSLAB_colors("maroon"))
} else {
println("Maroon key does not exist")
}
}
}
Demo.CHECK_KEYS_FOR_A_MAP(Array(" "))
/*****************************
File Name : CSLAB_TUPLE_V1
Purpose : A Program for Tuple in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 31/01/2019 14:35 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Tuple in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
def TUPLE (args: Array[String]) {
val vAR_CSLAB_t = (4,3,2,1)
val vAR_CSLAB_sum = vAR_CSLAB_t._1 + vAR_CSLAB_t._2 + vAR_CSLAB_t._3 + vAR_CSLAB_t._4
println( "Sum of elements: " + vAR_CSLAB_sum )
}
}
Demo.TUPLE(Array(" "))
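A tuple can also be taken apart in one step with a tuple pattern on the left-hand side of a `val`, which is usually clearer than chaining `_1` through `_4`. A short sketch:

```scala
val vAR_CSLAB_t = (4, 3, 2, 1)

// a tuple pattern binds one name per element in a single step
val (vAR_CSLAB_a, vAR_CSLAB_b, vAR_CSLAB_c, vAR_CSLAB_d) = vAR_CSLAB_t

val vAR_CSLAB_sum = vAR_CSLAB_a + vAR_CSLAB_b + vAR_CSLAB_c + vAR_CSLAB_d
println("Sum of elements: " + vAR_CSLAB_sum)
```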
/*****************************
File Name : CSLAB_ITERATING_OVER_A_TUPLE_V1
Purpose : A Program for Iterating Over a Tuple in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 31/01/2019 14:51 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Iterating Over a Tuple in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
def ITERATING_OVER_A_TUPLE (args: Array[String]) {
val vAR_CSLAB_t = (4,3,2,1)
vAR_CSLAB_t.productIterator.foreach{ vAR_CSLAB_i =>println("Value = " + vAR_CSLAB_i )}
}
}
Demo.ITERATING_OVER_A_TUPLE (Array(" "))
/*****************************
File Name : CSLAB_TUPLE_TO_STRING_V1
Purpose : A Program for Converting Tuple to String in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 31/01/2019 15:08 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Converting Tuple to String in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
def TUPLE_TO_STRING (args: Array[String]) {
val vAR_CSLAB_t = new Tuple3(1, "hello", Console)
println("Concatenated String: " + vAR_CSLAB_t.toString() )
}
}
Demo.TUPLE_TO_STRING (Array(" "))
/*****************************
File Name : CSLAB_SWAP_ELEMENTS_OF_A_TUPLE_V1
Purpose : A Program for Swapping Elements of a Tuple in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 31/01/2019 15:23 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Swapping Elements of a Tuple in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
def SWAP_ELEMENTS_OF_A_TUPLE (args: Array[String]) {
val vAR_CSLAB_t = new Tuple2("Scala", "hello")
println("Swapped Tuple: " + vAR_CSLAB_t.swap )
}
}
Demo.SWAP_ELEMENTS_OF_A_TUPLE (Array(" "))
/*****************************
File Name : CSLAB_OPTION_TYPE_V1
Purpose : A Program for Option Type in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 31/01/2019 15:39 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Option Type in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
def OPTION_TYPE(args: Array[String]) {
val vAR_CSLAB_capitals = Map("France" -> "Paris", "Japan" -> "Tokyo")
println("show(vAR_CSLAB_capitals.get( \"Japan\")) : " + show(vAR_CSLAB_capitals.get( "Japan")) )
println("show(vAR_CSLAB_capitals.get( \"India\")) : " + show(vAR_CSLAB_capitals.get( "India")) )
}
def show(vAR_CSLAB_x: Option[String]) = vAR_CSLAB_x match {
case Some(vAR_CSLAB_s) => vAR_CSLAB_s
case None => "?"
}
}
Demo.OPTION_TYPE(Array(" "))
/*****************************
File Name : CSLAB_OPTION_TYPE_GET_OR_ELSE_METHOD_V1
Purpose : A Program for Option Type - Get or Else Method in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 31/01/2019 15:55 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Option Type - Get or Else Method in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
def GET_OR_ELSE_METHOD (args: Array[String]) {
val vAR_CSLAB_a:Option[Int] = Some(5)
val vAR_CSLAB_b:Option[Int] = None
println("vAR_CSLAB_a.getOrElse(0): " + vAR_CSLAB_a.getOrElse(0) )
println("vAR_CSLAB_b.getOrElse(10): " + vAR_CSLAB_b.getOrElse(10) )
}
}
Demo.GET_OR_ELSE_METHOD(Array(" "))
/*****************************
File Name : CSLAB_OPTION_TYPE_EMPTY_METHOD_V1
Purpose : A Program for Option Type - Empty Method in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 31/01/2019 16:12 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Option Type - Empty Method in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
def EMPTY_METHOD (args: Array[String]) {
val vAR_CSLAB_a:Option[Int] = Some(5)
val vAR_CSLAB_b:Option[Int] = None
println("vAR_CSLAB_a.isEmpty: " + vAR_CSLAB_a.isEmpty )
println("vAR_CSLAB_b.isEmpty: " + vAR_CSLAB_b.isEmpty )
}
}
Demo.EMPTY_METHOD(Array(" "))
/*****************************
File Name : CSLAB_OPTION_TYPE_ITERATORS_V1
Purpose : A Program for Iterators in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 31/01/2019 16:29 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Iterators in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
def ITERATORS (args: Array[String]) {
val vAR_CSLAB_it = Iterator("a", "number", "of", "words")
while (vAR_CSLAB_it.hasNext){
println(vAR_CSLAB_it.next())
}
}
}
Demo.ITERATORS(Array(" "))
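Unlike a List, an Iterator can be traversed only once; any method that walks it (`size`, `max`, `foreach`, the `while` loop above) exhausts it. A minimal sketch demonstrating this:

```scala
val vAR_CSLAB_it = Iterator("a", "number", "of", "words")

// size walks the iterator to count its elements, consuming all of them
val vAR_CSLAB_count = vAR_CSLAB_it.size
println("size = " + vAR_CSLAB_count)

// the iterator is now exhausted and cannot be traversed again
println("hasNext after size: " + vAR_CSLAB_it.hasNext) // false
```

When the same elements must be traversed more than once, convert to a List first or build a fresh iterator for each pass.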
/*****************************
File Name : CSLAB_OPTION_TYPE_ELEMENTS_FROM_ITERATORS_V1
Purpose : A Program for Elements from Iterators in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 31/01/2019 16:49 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Elements from Iterators in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
def ELEMENTS_FROM_ITERATORS (args: Array[String]) {
// max and min each consume an iterator, so a separate iterator is created for each call
val vAR_CSLAB_ita = Iterator(20,40,2,50,69, 90)
val vAR_CSLAB_itb = Iterator(20,40,2,50,69, 90)
println("Maximum valued element " + vAR_CSLAB_ita.max )
println("Minimum valued element " + vAR_CSLAB_itb.min )
}
}
Demo.ELEMENTS_FROM_ITERATORS(Array(" "))
/*****************************
File Name : CSLAB_OPTION_TYPE_LENGTH_OF_AN_ITERATOR_V1
Purpose : A Program for the Length of an Iterator in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 31/01/2019 17:18 hrs
Version : 1.0
/*****************************
## Program Description : A Program for the Length of an Iterator in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
def LENGTH_OF_AN_ITERATOR (args: Array[String]) {
// size and length consume an iterator, so a separate iterator is created for each call
val vAR_CSLAB_ita = Iterator(20,40,2,50,69, 90)
val vAR_CSLAB_itb = Iterator(20,40,2,50,69, 90)
println("Value of vAR_CSLAB_ita.size : " + vAR_CSLAB_ita.size )
println("Value of vAR_CSLAB_itb.length : " + vAR_CSLAB_itb.length )
}
}
Demo.LENGTH_OF_AN_ITERATOR (Array(" "))
/*****************************
File Name : CSLAB_TRAITS_V1
Purpose : A Program for Traits in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 31/01/2019 17:37 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Traits in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
trait Equal {
def isEqual(vAR_CSLAB_x: Any): Boolean
def isNotEqual(vAR_CSLAB_x: Any): Boolean = !isEqual(vAR_CSLAB_x)
}
class Point(vAR_CSLAB_xc: Int, vAR_CSLAB_yc: Int) extends Equal {
var vAR_CSLAB_x: Int = vAR_CSLAB_xc
var vAR_CSLAB_y: Int = vAR_CSLAB_yc
def isEqual(vAR_CSLAB_obj: Any) =
vAR_CSLAB_obj.isInstanceOf[Point] &&
vAR_CSLAB_obj.asInstanceOf[Point].vAR_CSLAB_x == vAR_CSLAB_x &&
vAR_CSLAB_obj.asInstanceOf[Point].vAR_CSLAB_y == vAR_CSLAB_y
}
object Demo {
def TRAITS (args: Array[String]) {
val vAR_CSLAB_p1 = new Point(2, 3)
val vAR_CSLAB_p2 = new Point(2, 4)
val vAR_CSLAB_p3 = new Point(2, 3)
println(vAR_CSLAB_p1.isNotEqual(vAR_CSLAB_p2)) // true: the y coordinates differ
println(vAR_CSLAB_p1.isNotEqual(vAR_CSLAB_p3)) // false: both coordinates match
println(vAR_CSLAB_p1.isNotEqual(2)) // true: 2 is not a Point
}
}
Demo.TRAITS (Array(" "))
/*****************************
File Name : CSLAB_PATTERN_MATCHING_V1
Purpose : A Program for Pattern Matching in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 31/01/2019 17:58 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Pattern Matching in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
def PATTERN_MATCHING (args: Array[String]) {
println(matchTest("two"))
println(matchTest("test"))
println(matchTest(1))
}
def matchTest(vAR_CSLAB_x: Any): Any = vAR_CSLAB_x match {
case 1 => "one"
case "two" => 2
case vAR_CSLAB_y: Int => "scala.Int"
case _ => "many"
}
}
Demo.PATTERN_MATCHING(Array(" "))
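A match case can also carry a guard, an `if` condition after the pattern that must hold for the case to fire. A small sketch with a hypothetical helper:

```scala
// hypothetical helper showing pattern guards (the `if` after each pattern)
def vAR_CSLAB_describe(vAR_CSLAB_x: Any): String = vAR_CSLAB_x match {
  case vAR_CSLAB_n: Int if vAR_CSLAB_n < 0 => "negative int"   // guard restricts the Int case
  case vAR_CSLAB_n: Int                    => "non-negative int"
  case vAR_CSLAB_s: String                 => "string of length " + vAR_CSLAB_s.length
  case _                                   => "something else"
}

println(vAR_CSLAB_describe(-5))    // negative int
println(vAR_CSLAB_describe(7))     // non-negative int
println(vAR_CSLAB_describe("two")) // string of length 3
```

Cases are tried top to bottom, so the guarded case must appear before the unguarded one of the same type.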
/*****************************
File Name : CSLAB_PATTERN_MATCHING_USING_CASE_CLASSES_V1
Purpose : A Program for Pattern Matching Using Case Classes in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 01/02/2019 09:37 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Pattern Matching Using Case Classes in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
def PATTERN_MATCHING_CASE_CLASSES (args: Array[String]) {
// a case class provides a companion apply method, so new is not required
val vAR_CSLAB_alice = vAR_CSLAB_Person("Alice", 25)
val vAR_CSLAB_bob = vAR_CSLAB_Person("Bob", 32)
val vAR_CSLAB_charlie = vAR_CSLAB_Person("Charlie", 32)
for (vAR_CSLAB_person <- List(vAR_CSLAB_alice, vAR_CSLAB_bob, vAR_CSLAB_charlie)) {
vAR_CSLAB_person match {
case vAR_CSLAB_Person("Alice", 25) => println("Hi Alice!")
case vAR_CSLAB_Person("Bob", 32) => println("Hi Bob!")
case vAR_CSLAB_Person(vAR_CSLAB_name, vAR_CSLAB_age) => println(
"Age: " + vAR_CSLAB_age + " year, name: " + vAR_CSLAB_name + "?")
}
}
}
case class vAR_CSLAB_Person(vAR_CSLAB_name: String, vAR_CSLAB_age: Int)
}
Demo.PATTERN_MATCHING_CASE_CLASSES(Array(" "))
/*****************************
File Name : CSLAB_REGULAR_EXPRESSIONS_V1
Purpose : A Program for Regular Expressions in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 01/02/2019 09:58 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Regular Expressions in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import scala.util.matching.Regex
object Demo {
def REGULAR_EXPRESSIONS (args: Array[String]) {
val vAR_CSLAB_pattern = new Regex("(S|s)cala")
val vAR_CSLAB_str = "Scala is scalable and cool"
println((vAR_CSLAB_pattern findAllIn vAR_CSLAB_str).mkString(","))
}
}
Demo.REGULAR_EXPRESSIONS(Array(" "))
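Besides `findAllIn`, Regex offers `findFirstIn` (returning an `Option[String]`) and `findFirstMatchIn`, which exposes capture groups. A minimal sketch reusing the same pattern:

```scala
import scala.util.matching.Regex

val vAR_CSLAB_pattern = new Regex("(S|s)cala")
val vAR_CSLAB_str = "Scala is scalable and cool"

// findFirstIn returns Some(match) or None instead of an iterator
val vAR_CSLAB_first = vAR_CSLAB_pattern.findFirstIn(vAR_CSLAB_str)
println(vAR_CSLAB_first) // Some(Scala)

// findFirstMatchIn exposes capture groups via group(n)
val vAR_CSLAB_groupLetter =
  vAR_CSLAB_pattern.findFirstMatchIn(vAR_CSLAB_str).map(_.group(1))
println(vAR_CSLAB_groupLetter) // Some(S): the captured (S|s) group
```

Returning an Option makes the "no match" case explicit, which combines naturally with the Option methods covered earlier.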
/*****************************
File Name : CSLAB_REGULAR_EXPRESSIONS_REPLACE_MATCHING_TEXT_V1
Purpose : A Program for Regular Expressions - Replace Matching Text in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 01/02/2019 10:12 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Regular Expressions - Replace Matching Text in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import scala.util.matching.Regex
object Demo {
def REGULAR_EXPRESSIONS_REPLACE_MATCHING_TEXT (args: Array[String]) {
val vAR_CSLAB_pattern = "(S|s)cala".r
val vAR_CSLAB_str = "Scala is scalable and cool"
println(vAR_CSLAB_pattern replaceFirstIn(vAR_CSLAB_str, "Java"))
}
}
Demo.REGULAR_EXPRESSIONS_REPLACE_MATCHING_TEXT(Array(" "))
/*****************************
File Name : CSLAB_EXCEPTION_HANDLING_CATCHING_EXCEPTIONS_V1
Purpose : A Program for Exception Handling - Catching Exceptions in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 01/02/2019 10:28 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Exception Handling - Catching Exceptions in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import java.io.FileReader
import java.io.FileNotFoundException
import java.io.IOException
object Demo {
def CATCHING_EXCEPTIONS(args: Array[String]) {
try {
val vAR_CSLAB_f = new FileReader("input.txt")
} catch {
case ex: FileNotFoundException =>{
println("Missing file exception")
}
case ex: IOException => {
println("IO Exception")
}
}
}
}
Demo.CATCHING_EXCEPTIONS(Array(" "))
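As an alternative to try/catch, the standard library's `scala.util.Try` wraps a computation and materializes the outcome as `Success` or `Failure`, which can then be pattern matched like any other value. A sketch, assuming a deliberately missing file name:

```scala
import java.io.FileReader
import scala.util.{Try, Success, Failure}

// Try captures any non-fatal exception instead of letting it propagate;
// "no_such_file.txt" is assumed not to exist, so this yields a Failure
val vAR_CSLAB_result = Try(new FileReader("no_such_file.txt"))

vAR_CSLAB_result match {
  case Success(vAR_CSLAB_reader) =>
    println("Opened file")
    vAR_CSLAB_reader.close()
  case Failure(vAR_CSLAB_ex) =>
    println("Could not open file: " + vAR_CSLAB_ex.getClass.getSimpleName)
}
```

Because the failure is an ordinary value, it can be returned, stored, or transformed with `map` and `recover` rather than handled at the call site.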
/*****************************
File Name : CSLAB_EXCEPTION_HANDLING_FINALLY_CLAUSE_V1
Purpose : A Program for Exception Handling - Finally Clause in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 01/02/2019 10:43 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Exception Handling - Finally Clause in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import java.io.FileReader
import java.io.FileNotFoundException
import java.io.IOException
object Demo {
def FINALLY_CLAUSE(args: Array[String]) {
try {
val vAR_CSLAB_f = new FileReader("input.txt")
} catch {
case ex: FileNotFoundException => {
println("Missing file exception")
}
case ex: IOException => {
println("IO Exception")
}
} finally {
println("Exiting finally...")
}
}
}
Demo.FINALLY_CLAUSE(Array(" "))
/*****************************
File Name : CSLAB_EXTRACTORS_V1
Purpose : A Program for Extractors in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 01/02/2019 11:05 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Extractors in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
def EXTRACTORS (args: Array[String]) {
println ("Apply method : " + apply("Zara", "gmail.com"));
println ("Unapply method : " + unapply("Zara@gmail.com"));
println ("Unapply method : " + unapply("Zara Ali"));
}
// The injection method (optional)
def apply(vAR_CSLAB_user: String, vAR_CSLAB_domain: String) = {
vAR_CSLAB_user + "@" + vAR_CSLAB_domain
}
// The extraction method (mandatory)
def unapply(vAR_CSLAB_str: String): Option[(String, String)] = {
val vAR_CSLAB_parts = vAR_CSLAB_str split "@"
if (vAR_CSLAB_parts.length == 2){
Some((vAR_CSLAB_parts(0), vAR_CSLAB_parts(1)))
} else {
None
}
}
}
Demo.EXTRACTORS(Array(" "))
/*****************************
File Name : CSLAB_PATTERN_MATCHING_WITH_EXTRACTORS_V1
Purpose : A Program for Pattern Matching WITH Extractors in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 01/02/2019 11:23 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Pattern Matching in Extractors in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
def PATTERN_MATCHING_WITH_EXTRACTORS (args: Array[String]) {
val vAR_CSLAB_x = Demo(5)
println(vAR_CSLAB_x)
vAR_CSLAB_x match {
case Demo(num) => println(vAR_CSLAB_x+" is bigger two times than "+num)
//unapply is invoked
case _ => println("i cannot calculate")
}
}
def apply(vAR_CSLAB_x: Int) = vAR_CSLAB_x*2
def unapply(vAR_CSLAB_z: Int): Option[Int] = if (vAR_CSLAB_z%2==0) Some(vAR_CSLAB_z/2) else None
}
Demo.PATTERN_MATCHING_WITH_EXTRACTORS(Array(" "))
/*****************************
File Name : CSLAB_FILES_IO_READING_FROM_COMMAND_LINE_V1
Purpose : A Program for Reading from Command Line in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 01/02/2019 11:41 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Reading from Command Line in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Demo {
def FILES_IO_READ_FROM_COMMAND_LINE (args: Array[String]) {
print("Please enter your input : " )
val vAR_CSLAB_line = scala.io.StdIn.readLine() // Console.readLine is deprecated; StdIn.readLine is the current API
println("Thanks, you just typed: " + vAR_CSLAB_line)
}
}
Demo.FILES_IO_READ_FROM_COMMAND_LINE(Array(" "))
/*****************************
File Name : CSLAB_FILES_IO_READING_FROM_A_FILE_CONTENT_V1
Purpose : A Program for Reading from a File Content in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 01/02/2019 11:59 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Reading from a File Content in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import scala.io.Source
object Demo {
def FILE_IO_READING_FILE_CONTENT(args: Array[String]) {
println("Following is the content read:" )
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH1")
var vAR_CSLAB_DATA_FILE = "Data.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
Source.fromFile(vAR_CSLAB_FILE_PATH).foreach(print)
}
}
Demo.FILE_IO_READING_FILE_CONTENT(Array(" "))
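The `Source` opened above is never closed, which leaks a file handle. A self-contained sketch of the same read with an explicit `close` in a `finally` block; the temporary file exists only so the example runs without `SCALA_TUTORIAL_PATH1` (on Scala 2.13+, `scala.util.Using.resource` does the same thing declaratively):

```scala
import java.nio.file.Files
import scala.io.Source

// Create a throwaway file so the sketch does not depend on SCALA_TUTORIAL_PATH1.
val vAR_CSLAB_tmpPath = Files.createTempFile("cslab_demo", ".txt")
Files.write(vAR_CSLAB_tmpPath, "line one\nline two\n".getBytes("UTF-8"))

// try/finally guarantees the handle is released even if reading throws.
val vAR_CSLAB_src = Source.fromFile(vAR_CSLAB_tmpPath.toFile)
val vAR_CSLAB_contents: List[String] =
  try vAR_CSLAB_src.getLines().toList
  finally vAR_CSLAB_src.close()

Files.delete(vAR_CSLAB_tmpPath)
```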
/*****************************
File Name : CSLAB_IMPORTING_TEXT_FILE_IN_SCALA_V1
Purpose : A Program for Importing Data from a Text File in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 06/02/2019 09:32 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Importing Data from a Text File in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import scala.io.Source
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH1")
var vAR_CSLAB_DATA_FILE = "Data.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
for(vAR_CSLAB_line <- Source.fromFile(vAR_CSLAB_FILE_PATH).getLines())
println(vAR_CSLAB_line)
/*****************************
File Name : CSLAB_WRITING_TEXT_FILE_IN_SCALA_V1
Purpose : A Program for Writing Data to Text Files in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 06/02/2019 9:43 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Writing Data to Text Files in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import java.io.File
import java.io.PrintWriter
import scala.io.Source
object Write {
def main(args: Array[String]) {
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH1")
var vAR_CSLAB_DATA_FILE = "/Write_to_Text.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
val vAR_CSLAB_writer = new PrintWriter(new File(vAR_CSLAB_FILE_PATH))
vAR_CSLAB_writer.write("Hello Developer, Welcome to Scala Programming.")
vAR_CSLAB_writer.close()
Source.fromFile(vAR_CSLAB_FILE_PATH).foreach { x => print(x) }
}
}
Write.main(Array(" "))
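Note that `new PrintWriter(new File(...))` truncates an existing file. To add to a file instead, wrap a `java.io.FileWriter` opened in append mode. A self-contained sketch using a temporary file (the file name is illustrative):

```scala
import java.io.{File, FileWriter, PrintWriter}
import scala.io.Source

val vAR_CSLAB_tmp = File.createTempFile("cslab_write", ".txt")

// First writer truncates (or creates) the file.
val vAR_CSLAB_w1 = new PrintWriter(vAR_CSLAB_tmp)
vAR_CSLAB_w1.println("first line")
vAR_CSLAB_w1.close()

// FileWriter(file, true) opens in append mode, so earlier content survives.
val vAR_CSLAB_w2 = new PrintWriter(new FileWriter(vAR_CSLAB_tmp, true))
vAR_CSLAB_w2.println("second line")
vAR_CSLAB_w2.close()

val vAR_CSLAB_src2 = Source.fromFile(vAR_CSLAB_tmp)
val vAR_CSLAB_written = try vAR_CSLAB_src2.getLines().toList finally vAR_CSLAB_src2.close()
vAR_CSLAB_tmp.delete()
```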
/*****************************
File Name : CSLAB_IMPORTING_CSV_FILE_IN_SCALA_V1
Purpose : A Program for Reading from a CSV File in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 06/02/2019 9:55 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Reading from a CSV File in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import scala.io.Source
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH1")
var vAR_CSLAB_DATA_FILE = "Unit2_Program78_Read.csv";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
for(vAR_CSLAB_line <- Source.fromFile(vAR_CSLAB_FILE_PATH).getLines())
println(vAR_CSLAB_line)
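`getLines` only yields raw lines; turning each CSV row into named fields takes one more `split`. A minimal sketch with inline data, so it runs anywhere. No quoting or escaping is handled here; real CSV files need a proper parser library:

```scala
// Inline CSV so the sketch is self-contained; the first row is the header.
val vAR_CSLAB_csv = "name,qty,price\nwidget,3,1.50\ngadget,7,2.25"
val vAR_CSLAB_rows: List[Array[String]] =
  vAR_CSLAB_csv.split("\n").toList.map(_.split(","))

// Zip the header with each data row to get column-name -> value maps.
val vAR_CSLAB_header = vAR_CSLAB_rows.head
val vAR_CSLAB_records: List[Map[String, String]] =
  vAR_CSLAB_rows.tail.map(fields => vAR_CSLAB_header.zip(fields).toMap)
```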
/*****************************
File Name : CSLAB_READING_BINARY_FILES_IN_SCALA_V1
Purpose : A Program for Reading Binary Files in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 06/02/2019 10:14 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Reading Binary Files in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import java.nio.file.{Files, Paths}
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH1")
var vAR_CSLAB_DATA_FILE = "sample.bin";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
val vAR_CSLAB_byteArray = Files.readAllBytes(Paths.get(vAR_CSLAB_FILE_PATH))
vAR_CSLAB_byteArray
/*****************************
File Name : CSLAB_READING_FILE_FROM_EXCEL_IN_SCALA_V1
Purpose : A Program for Reading an Excel File in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 06/02/2019 10:31 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Reading an Excel File in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import java.io._
object Test {
def main(args: Array[String]) {
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH1")
var vAR_CSLAB_DATA_FILE = "Unit2_Program80_Read.xlsx";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
val vAR_CSLAB_writer = new PrintWriter(new File(vAR_CSLAB_FILE_PATH))
vAR_CSLAB_writer.write("Hello Scala")
vAR_CSLAB_writer.close()
}
}
Test.main(Array(" "))
/*****************************
File Name : CSLAB_IMPORTING_XML_FILE_IN_SCALA_V1
Purpose : Code for Importing XML File in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 01/18/2015 11:37 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Importing Data from an XML File in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import scala.xml.XML
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH1")
var vAR_CSLAB_DATA_FILE = "Unit2_Program81_Read_XML.xml";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
val vAR_CSLAB_xml = XML.loadFile(vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_temp = (vAR_CSLAB_xml \\ "channel" \\ "item" \ "condition" \ "@temp").text
/*****************************
File Name : CSLAB_READING_FILE_FROM_URL_IN_SCALA_V1
Purpose : A Program for Reading from a URL in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 06/02/2019 10:49 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Reading from a URL in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import scala.io.Source
val vAR_CSLAB_holmesUrl = "http://www.gutenberg.org/cache/epub/1661/pg1661.txt"
for (line <- Source.fromURL(vAR_CSLAB_holmesUrl).getLines)
println(line)
/*****************************
File Name : CSLAB_PROCESSING_CHARACTERS_IN_A_TEXT_FILE_SCALA_V1
Purpose : A Program for Processing Characters from a Text File in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 06/02/2019 11:13 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Processing Characters from a Text File in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import scala.io.Source
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH1")
var vAR_CSLAB_DATA_FILE = "Data.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
val vAR_CSLAB_source = Source.fromFile(vAR_CSLAB_FILE_PATH)
for (vAR_CSLAB_line <- vAR_CSLAB_source.getLines())
for (vAR_CSLAB_char <- vAR_CSLAB_line) {
println(vAR_CSLAB_char.toUpper)
}
vAR_CSLAB_source.close()
/*****************************
File Name : CSLAB_LOADING_DATASET_IN_SCALA_V1
Purpose : A Program for Loading a Dataset in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 06/02/2019 11:28 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Loading a Dataset in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import scala.io.Source
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH1")
var vAR_CSLAB_DATA_FILE = "Cash_Flow.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
val vAR_CSLAB_filename = vAR_CSLAB_FILE_PATH
for (vAR_CSLAB_line <- Source.fromFile(vAR_CSLAB_filename).getLines()) {
println(vAR_CSLAB_line)
}
/*****************************
File Name : CSLAB_LOADING_DATASET_HANDLING_EXCEPTIONS_IN_SCALA_V1
Purpose : A Program for Handling Exceptions while Loading a Dataset in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 06/02/2019 11:47 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Handling Exceptions while Loading a Dataset in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import scala.io.Source
val vAR_CSLAB_filename = "no-such-file.scala"
try {
for (vAR_CSLAB_line <- Source.fromFile(vAR_CSLAB_filename).getLines()) {
println(vAR_CSLAB_line)
}
} catch {
case ex: Exception => println("Bummer, an exception happened.")
}
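The same failure can also be captured as a value with `scala.util.Try`, which keeps the happy path and the error path as ordinary data instead of a control-flow jump. A sketch; the `readLines` helper is an illustrative name:

```scala
import scala.io.Source
import scala.util.{Failure, Success, Try}

// Try wraps the whole read; any thrown exception becomes a Failure value.
def readLines(path: String): Try[List[String]] = Try {
  val src = Source.fromFile(path)
  try src.getLines().toList
  finally src.close()
}

val vAR_CSLAB_result = readLines("no-such-file.scala") match {
  case Success(lines) => s"read ${lines.size} lines"
  case Failure(ex)    => s"failed: ${ex.getClass.getSimpleName}"
}
```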
/*****************************
File Name : CSLAB_CORRELATION_MODEL_V1
Purpose : A Program for Correlation Model in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 06/02/2019 12:15 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Correlation Model in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.linalg._
import org.apache.spark.mllib.stat.Statistics
val vAR_CSLAB_sp = sc.parallelize(List(
Vectors.dense(2100,1620000),
Vectors.dense(2300,1690000),
Vectors.dense(2046,1400000),
Vectors.dense(4314,2000000),
Vectors.dense(1244,1060000),
Vectors.dense(4608,3830000),
Vectors.dense(2173,1230000),
Vectors.dense(2750,2400000),
Vectors.dense(4010,3280000),
Vectors.dense(1959,1480000)
))
val vAR_CSLAB_corr = Statistics.corr(vAR_CSLAB_sp)
vAR_CSLAB_corr
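For intuition about what `Statistics.corr` returns, the Pearson coefficient can be computed in plain Scala, with no SparkContext, straight from its definition r = cov(x, y) / (sd(x) * sd(y)). A sketch for learning, not the MLlib implementation:

```scala
// Pearson correlation from first principles; assumes equal-length, non-constant series.
def pearson(xs: Seq[Double], ys: Seq[Double]): Double = {
  require(xs.size == ys.size && xs.nonEmpty, "series must be non-empty and aligned")
  val n  = xs.size.toDouble
  val mx = xs.sum / n
  val my = ys.sum / n
  val cov = xs.zip(ys).map { case (x, y) => (x - mx) * (y - my) }.sum
  val sdx = math.sqrt(xs.map(x => (x - mx) * (x - mx)).sum)
  val sdy = math.sqrt(ys.map(y => (y - my) * (y - my)).sum)
  cov / (sdx * sdy)
}
```

Applied to the square-footage/price pairs above, this yields a strongly positive value, matching the upward trend visible in the data.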
/*****************************
File Name : CSLAB_STRATIFIED_SAMPLING_MODEL_V1
Purpose : A Program for Stratified Sampling Model in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 06/02/2019 12:33 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Stratified Sampling Model in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
// an RDD[(K, V)] of any key value pairs
val vAR_CSLAB_data = sc.parallelize(
Seq((1, 'a'), (1, 'b'), (2, 'c'), (2, 'd'), (2, 'e'), (3, 'f')))
// specify the exact fraction desired from each key
val vAR_CSLAB_fractions = Map(1 -> 0.1, 2 -> 0.6, 3 -> 0.3)
// Get an approximate sample from each stratum
val vAR_CSLAB_approxSample = vAR_CSLAB_data.sampleByKey(withReplacement = false, fractions = vAR_CSLAB_fractions)
// Get an exact sample from each stratum
val exactSample = vAR_CSLAB_data.sampleByKeyExact(withReplacement = false, fractions = vAR_CSLAB_fractions)
/*****************************
File Name : CSLAB_HYPOTHESIS_TESTING_MODEL_V1
Purpose : A Program for Hypothesis Testing Model in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 06/02/2019 13:02 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Hypothesis Testing Model in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.linalg._
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.mllib.stat.Statistics
import org.apache.spark.mllib.stat.test.ChiSqTestResult
import org.apache.spark.rdd.RDD
// a vector composed of the frequencies of events
val vAR_CSLAB_vec: Vector = Vectors.dense(0.1, 0.15, 0.2, 0.3, 0.25)
// compute the goodness of fit. If a second vector to test against is not supplied
// as a parameter, the test runs against a uniform distribution.
val vAR_CSLAB_goodnessOfFitTestResult = Statistics.chiSqTest(vAR_CSLAB_vec)
// summary of the test including the p-value, degrees of freedom, test statistic, the method
// used, and the null hypothesis.
println(s"$vAR_CSLAB_goodnessOfFitTestResult\n")
// a contingency matrix. Create a dense matrix ((1.0, 2.0), (3.0, 4.0), (5.0, 6.0))
val vAR_CSLAB_mat: Matrix = Matrices.dense(3, 2, Array(1.0, 3.0, 5.0, 2.0, 4.0, 6.0))
// conduct Pearson's independence test on the input contingency matrix
val vAR_CSLAB_independenceTestResult = Statistics.chiSqTest(vAR_CSLAB_mat)
// summary of the test including the p-value, degrees of freedom
println(s"$vAR_CSLAB_independenceTestResult\n")
val vAR_CSLAB_obs: RDD[LabeledPoint] =
sc.parallelize(
Seq(
LabeledPoint(1.0, Vectors.dense(1.0, 0.0, 3.0)),
LabeledPoint(1.0, Vectors.dense(1.0, 2.0, 0.0)),
LabeledPoint(-1.0, Vectors.dense(-1.0, 0.0, -0.5))
)
) // (feature, label) pairs.
val vAR_CSLAB_featureTestResults: Array[ChiSqTestResult] = Statistics.chiSqTest(vAR_CSLAB_obs)
vAR_CSLAB_featureTestResults.zipWithIndex.foreach { case (k, v) =>
println("Column " + (v + 1).toString + ":")
println(k)
} // summary of the test
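The goodness-of-fit statistic above has a simple closed form: X^2 = sum over i of (observed_i - expected_i)^2 / expected_i, with a uniform expectation when no second vector is supplied. A plain-Scala sketch of just that arithmetic (not the MLlib implementation, which also derives degrees of freedom and the p-value):

```scala
// Chi-square goodness-of-fit statistic against a uniform expected distribution.
def chiSqUniform(observed: Seq[Double]): Double = {
  val expected = observed.sum / observed.size
  observed.map(o => (o - expected) * (o - expected) / expected).sum
}

// The frequency vector used above sums to 1.0 over 5 cells, so the
// uniform expectation is 0.2 per cell and the statistic works out to 0.125.
val vAR_CSLAB_stat = chiSqUniform(Seq(0.1, 0.15, 0.2, 0.3, 0.25))
```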
/*****************************
File Name : CSLAB_RANDOM_DATA_GENERATION_MODEL_V1
Purpose : A Program for Random Number Generation Model in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 06/02/2019 13:34 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Random Number Generation Model in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
val vAR_CSLAB_r = scala.util.Random
vAR_CSLAB_r.nextInt
vAR_CSLAB_r.nextInt(100)
vAR_CSLAB_r.nextFloat
vAR_CSLAB_r.nextDouble
val vAR_CSLAB_r1 = new scala.util.Random(100)
// random characters
vAR_CSLAB_r1.nextPrintableChar
vAR_CSLAB_r1.nextPrintableChar
for (i <- 0 to vAR_CSLAB_r1.nextInt(10)) yield vAR_CSLAB_r1.nextPrintableChar
// create a random length range
var vAR_CSLAB_range = 0 to vAR_CSLAB_r1.nextInt(10)
vAR_CSLAB_range = 0 to vAR_CSLAB_r1.nextInt(10)
for (i <- 0 to vAR_CSLAB_r1.nextInt(10)) yield i * 2
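The seeded constructor `new scala.util.Random(100)` is what makes the calls above reproducible: two generators built from the same seed emit identical sequences. A small sketch (the `drawInts` helper is an illustrative name):

```scala
// Same seed => same sequence; this is how "random" results are made repeatable.
def drawInts(seed: Long, n: Int): Seq[Int] = {
  val rng = new scala.util.Random(seed)
  Seq.fill(n)(rng.nextInt(100))
}

val vAR_CSLAB_runA = drawInts(100L, 5)
val vAR_CSLAB_runB = drawInts(100L, 5)
// runA == runB, and every draw lies in [0, 100) as nextInt(100) guarantees.
```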
/*****************************
File Name : CSLAB_KERNEL_DENSITY_ESTIMATION_MODEL_V1
Purpose : A Program for Kernel Density Estimation Model in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 06/02/2019 13:59 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Kernel Density Estimation Model in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.stat.KernelDensity
import org.apache.spark.rdd.RDD
// an RDD of sample data
val vAR_CSLAB_data: RDD[Double] = sc.parallelize(Seq(1, 1, 1, 2, 3, 4, 5, 5, 6, 7, 8, 9, 9))
// Construct the density estimator with the sample data and a standard deviation
// for the Gaussian kernels
val vAR_CSLAB_kd = new KernelDensity()
.setSample(vAR_CSLAB_data)
.setBandwidth(3.0)
// Find density estimates for the given values
val vAR_CSLAB_densities = vAR_CSLAB_kd.estimate(Array(-1.0, 2.0, 5.0))
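What `KernelDensity.estimate` computes at each query point is an average of Gaussian bumps centred on the samples: f(x) = (1/n) * sum over i of N(x; x_i, h^2), where h is the bandwidth. A plain-Scala sketch of that formula for intuition (not the MLlib implementation):

```scala
// Gaussian kernel density estimate at x, given samples and bandwidth h.
def gaussianKde(sample: Seq[Double], h: Double)(x: Double): Double = {
  val norm = 1.0 / (h * math.sqrt(2 * math.Pi))
  sample.map { xi =>
    val z = (x - xi) / h
    norm * math.exp(-0.5 * z * z)
  }.sum / sample.size
}

// Same sample and bandwidth as the MLlib example above.
val vAR_CSLAB_kde: Double => Double =
  gaussianKde(Seq(1.0, 1.0, 1.0, 2.0, 3.0, 4.0, 5.0, 5.0, 6.0, 7.0, 8.0, 9.0, 9.0), 3.0)
```

The estimated density is higher near the mass of the samples than far outside it, which is the property the assertions below check.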
/*****************************
File Name : CSLAB_GAUSSIAN_MIXTURE_MODEL_V1
Purpose : A Program for Gaussian Mixture Model in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 06/02/2019 14:22 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Gaussian Mixture Model in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.clustering.GaussianMixture
import org.apache.spark.mllib.clustering.GaussianMixtureModel
import org.apache.spark.mllib.linalg.Vectors
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "gmm_data.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data
val vAR_CSLAB_data = sc.textFile(vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_parsedData = vAR_CSLAB_data.map(s => Vectors.dense(s.trim.split(' ').map(_.toDouble))).cache()
// Cluster the data into two classes using GaussianMixture
val vAR_CSLAB_gmm = new GaussianMixture().setK(2).run(vAR_CSLAB_parsedData)
// Save and load model
vAR_CSLAB_gmm.save(sc, "myGMMModel")
val vAR_CSLAB_sameModel = GaussianMixtureModel.load(sc, "myGMMModel")
// output parameters of max-likelihood model
for (i <- 0 until vAR_CSLAB_gmm.k) {
println("weight=%f\nmu=%s\nsigma=\n%s\n" format
(vAR_CSLAB_gmm.weights(i), vAR_CSLAB_gmm.gaussians(i).mu, vAR_CSLAB_gmm.gaussians(i).sigma))
}
/*****************************
File Name : CSLAB_KMEANS_MODEL_V1
Purpose : A Program for K-Means Clustering Model in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 06/02/2019 14:56 hrs
Version : 1.0
/*****************************
## Program Description : A Program for K-Means Clustering Model in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.clustering.{KMeans, KMeansModel}
import org.apache.spark.mllib.linalg.Vectors
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "kmeans_data.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data
val vAR_CSLAB_data = sc.textFile(vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_parsedData = vAR_CSLAB_data.map(s => Vectors.dense(s.split(' ').map(_.toDouble))).cache()
// Cluster the data into two classes using KMeans
val vAR_CSLAB_numClusters = 2
val vAR_CSLAB_numIterations = 20
val vAR_CSLAB_clusters = KMeans.train(vAR_CSLAB_parsedData, vAR_CSLAB_numClusters, vAR_CSLAB_numIterations)
// Evaluate clustering by computing Within Set Sum of Squared Errors
val vAR_CSLAB_WSSSE = vAR_CSLAB_clusters.computeCost(vAR_CSLAB_parsedData)
println("Within Set Sum of Squared Errors = " + vAR_CSLAB_WSSSE)
// Save and load model
vAR_CSLAB_clusters.save(sc, "myModelPath")
val vAR_CSLAB_sameModel = KMeansModel.load(sc, "myModelPath")
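`KMeans.train` is a distributed form of Lloyd's algorithm: repeatedly (1) assign every point to its nearest centroid, then (2) move each centroid to the mean of its assigned points. A one-dimensional plain-Scala sketch of those two steps, illustrative only and not the MLlib implementation:

```scala
// One iteration step: group points by nearest centroid, then recentre.
def kmeans1d(points: Seq[Double], centroids: Seq[Double], iters: Int): Seq[Double] =
  if (iters == 0) centroids
  else {
    val groups = points.groupBy(p => centroids.minBy(c => math.abs(p - c)))
    val moved  = centroids.map(c => groups.get(c).map(g => g.sum / g.size).getOrElse(c))
    kmeans1d(points, moved, iters - 1)
  }

// Two obvious clusters around 0.1 and 9.1; centroids converge to the cluster means.
val vAR_CSLAB_fitted = kmeans1d(Seq(0.0, 0.1, 0.2, 9.0, 9.1, 9.2), Seq(0.0, 5.0), 10)
```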
/*****************************
File Name : CSLAB_LINEAR_REGRESSION_MODEL_V1
Purpose : A Program for Linear Regression Model in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 06/02/2019 15:31 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Linear Regression Model in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.mllib.regression.LinearRegressionModel
import org.apache.spark.mllib.regression.LinearRegressionWithSGD
val vAR_CSLAB_conf = new SparkConf().setAppName("LinearRegression")
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "ridge-data/lpsa.data";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data
val vAR_CSLAB_data = sc.textFile(vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_parsedData = vAR_CSLAB_data.map { vAR_CSLAB_line =>
val vAR_CSLAB_parts = vAR_CSLAB_line.split(',')
LabeledPoint(vAR_CSLAB_parts(0).toDouble, Vectors.dense(vAR_CSLAB_parts(1).split(' ').map(_.toDouble)))
}.cache()
// Building the model
val vAR_CSLAB_numIterations = 100
val vAR_CSLAB_stepSize = 0.00000001
val vAR_CSLAB_model = LinearRegressionWithSGD.train(vAR_CSLAB_parsedData, vAR_CSLAB_numIterations, vAR_CSLAB_stepSize)
// Evaluate model on training examples and compute training error
val vAR_CSLAB_valuesAndPreds = vAR_CSLAB_parsedData.map { vAR_CSLAB_point =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_point.label, vAR_CSLAB_prediction)
}
val vAR_CSLAB_MSE = vAR_CSLAB_valuesAndPreds.map{ case(v, p) => math.pow((v - p), 2) }.mean()
println(s"training Mean Squared Error $vAR_CSLAB_MSE")
// Save and load model
vAR_CSLAB_model.save(sc, "LinearRegression")
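For the single-feature case, the least-squares line that SGD approaches iteratively has an exact closed form: slope = cov(x, y) / var(x) and intercept = mean(y) - slope * mean(x). A plain-Scala sketch of that solution and the training MSE, for intuition rather than as the MLlib implementation:

```scala
// Closed-form simple linear regression: y ~ slope * x + intercept.
def fitLine(xs: Seq[Double], ys: Seq[Double]): (Double, Double) = {
  val n  = xs.size.toDouble
  val mx = xs.sum / n
  val my = ys.sum / n
  val slope = xs.zip(ys).map { case (x, y) => (x - mx) * (y - my) }.sum /
    xs.map(x => (x - mx) * (x - mx)).sum
  (slope, my - slope * mx)
}

// Mean squared error of the fitted line on the same data.
def mse(xs: Seq[Double], ys: Seq[Double], slope: Double, intercept: Double): Double =
  xs.zip(ys).map { case (x, y) => math.pow(y - (slope * x + intercept), 2) }.sum / xs.size

// Exactly linear data y = 2x + 1, so the fit recovers slope 2 and intercept 1.
val (vAR_CSLAB_slope, vAR_CSLAB_intercept) =
  fitLine(Seq(1.0, 2.0, 3.0, 4.0), Seq(3.0, 5.0, 7.0, 9.0))
```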
/*****************************
File Name : CSLAB_LOGISTIC_REGRESSION_MODEL_V1
Purpose : A Program for Logistic Regression Model in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 06/02/2019 16:05 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Logistic Regression Model in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.classification.{LogisticRegressionModel, LogisticRegressionWithLBFGS}
import org.apache.spark.mllib.evaluation.MulticlassMetrics
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.mllib.util.MLUtils
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "sample_libsvm_data.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load training data in LIBSVM format.
val vAR_CSLAB_data = MLUtils.loadLibSVMFile(sc, vAR_CSLAB_FILE_PATH)
// Split data into training (60%) and test (40%).
val vAR_CSLAB_splits = vAR_CSLAB_data.randomSplit(Array(0.6, 0.4), seed = 11L)
val vAR_CSLAB_training = vAR_CSLAB_splits(0).cache()
val vAR_CSLAB_test = vAR_CSLAB_splits(1)
// Run training algorithm to build the model
val vAR_CSLAB_model = new LogisticRegressionWithLBFGS().setNumClasses(10).run(vAR_CSLAB_training)
// Compute raw scores on the test set.
val vAR_CSLAB_predictionAndLabels = vAR_CSLAB_test.map { case LabeledPoint(label, features) =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(features)
(vAR_CSLAB_prediction, label)
}
// Get evaluation metrics.
val vAR_CSLAB_metrics = new MulticlassMetrics(vAR_CSLAB_predictionAndLabels)
val vAR_CSLAB_accuracy = vAR_CSLAB_metrics.accuracy
println(s"Accuracy = $vAR_CSLAB_accuracy")
// Save and load model
vAR_CSLAB_model.save(sc, "target/tmp/scalaLogisticRegressionWithLBFGSModel1")
val vAR_CSLAB_sameModel = LogisticRegressionModel.load(sc,"target/tmp/scalaLogisticRegressionWithLBFGSModel1")
/*****************************
File Name : CSLAB_DECISION_TREE_MODEL_V1
Purpose : A Program for Decision Tree Model in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 06/02/2019 16:41 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Decision Tree Model in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.tree.DecisionTree
import org.apache.spark.mllib.tree.model.DecisionTreeModel
import org.apache.spark.mllib.util.MLUtils
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "sample_libsvm_data.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data file.
val vAR_CSLAB_data = MLUtils.loadLibSVMFile(sc, vAR_CSLAB_FILE_PATH)
// Split the data into training and test sets (30% held out for testing)
val vAR_CSLAB_splits = vAR_CSLAB_data.randomSplit(Array(0.7, 0.3))
val (vAR_CSLAB_trainingData, vAR_CSLAB_testData) = (vAR_CSLAB_splits(0), vAR_CSLAB_splits(1))
// Train a DecisionTree model.
// Empty categoricalFeaturesInfo indicates all features are continuous.
val vAR_CSLAB_numClasses = 2
val vAR_CSLAB_categoricalFeaturesInfo = Map[Int, Int]()
val vAR_CSLAB_impurity = "gini"
val vAR_CSLAB_maxDepth = 5
val vAR_CSLAB_maxBins = 32
val vAR_CSLAB_model = DecisionTree.trainClassifier(vAR_CSLAB_trainingData, vAR_CSLAB_numClasses, vAR_CSLAB_categoricalFeaturesInfo,
vAR_CSLAB_impurity, vAR_CSLAB_maxDepth, vAR_CSLAB_maxBins)
// Evaluate model on test instances and compute test error
val vAR_CSLAB_labelAndPreds = vAR_CSLAB_testData.map { vAR_CSLAB_point =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_point.label, vAR_CSLAB_prediction)
}
val vAR_CSLAB_testErr = vAR_CSLAB_labelAndPreds.filter(r => r._1 != r._2).count.toDouble / vAR_CSLAB_testData.count()
println("Test Error = " + vAR_CSLAB_testErr)
println("Learned classification tree model:\n" + vAR_CSLAB_model.toDebugString)
// Save and load model
vAR_CSLAB_model.save(sc, "myModelPath1")
val vAR_CSLAB_sameModel = DecisionTreeModel.load(sc, "myModelPath1")
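// As a side note on the "gini" impurity setting used above: the Gini impurity of a node
// measures how mixed its class counts are. A minimal plain-Scala sketch (no Spark; the
// gini helper below is illustrative, not part of the Spark API):

```scala
// Gini impurity: 1 minus the sum over classes of (class fraction) squared.
def gini(classCounts: Seq[Int]): Double = {
  val total = classCounts.sum.toDouble
  1.0 - classCounts.map(c => (c / total) * (c / total)).sum
}

println(gini(Seq(5, 5)))  // evenly mixed two-class node -> 0.5
println(gini(Seq(10, 0))) // pure node -> 0.0
```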
/*****************************
Disclaimer:
We provide this code block strictly for learning and research. It is not production-ready code. We assume no liability for this code under any circumstance; by using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendors who developed or own those products.
*****************************/
/*****************************
File Name : CSLAB_CLASSIFICATION_USING_RANDOM_FOREST_MODEL_V1
Purpose : A Program for Classification Using a Random Forest Model in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 06/02/2019 17:22 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Classification Using a Random Forest Model in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.tree.RandomForest
import org.apache.spark.mllib.tree.model.RandomForestModel
import org.apache.spark.mllib.util.MLUtils
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "sample_libsvm_data.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data file.
val vAR_CSLAB_data = MLUtils.loadLibSVMFile(sc, vAR_CSLAB_FILE_PATH)
// Split the data into training and test sets (30% held out for testing)
val vAR_CSLAB_splits = vAR_CSLAB_data.randomSplit(Array(0.7, 0.3))
val (vAR_CSLAB_trainingData, vAR_CSLAB_testData) = (vAR_CSLAB_splits(0), vAR_CSLAB_splits(1))
// Train a RandomForest model.
// Empty categoricalFeaturesInfo indicates all features are continuous.
val vAR_CSLAB_numClasses = 2
val vAR_CSLAB_categoricalFeaturesInfo = Map[Int, Int]()
val vAR_CSLAB_numTrees = 3 // Use more in practice.
val vAR_CSLAB_featureSubsetStrategy = "auto" // Let the algorithm choose.
val vAR_CSLAB_impurity = "gini"
val vAR_CSLAB_maxDepth = 4
val vAR_CSLAB_maxBins = 32
val vAR_CSLAB_model = RandomForest.trainClassifier(vAR_CSLAB_trainingData, vAR_CSLAB_numClasses, vAR_CSLAB_categoricalFeaturesInfo,
vAR_CSLAB_numTrees, vAR_CSLAB_featureSubsetStrategy, vAR_CSLAB_impurity, vAR_CSLAB_maxDepth, vAR_CSLAB_maxBins)
// Evaluate model on test instances and compute test error
val vAR_CSLAB_labelAndPreds = vAR_CSLAB_testData.map { vAR_CSLAB_point =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_point.label, vAR_CSLAB_prediction)
}
val vAR_CSLAB_testErr = vAR_CSLAB_labelAndPreds.filter(r => r._1 != r._2).count.toDouble / vAR_CSLAB_testData.count()
println("Test Error = " + vAR_CSLAB_testErr)
println("Learned classification forest model:\n" + vAR_CSLAB_model.toDebugString)
// Save and load model
vAR_CSLAB_model.save(sc, "myModelPath2")
val vAR_CSLAB_sameModel = RandomForestModel.load(sc, "myModelPath2")
/*****************************
Disclaimer:
We provide this code block strictly for learning and research. It is not production-ready code. We assume no liability for this code under any circumstance; by using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendors who developed or own those products.
*****************************/
/*****************************
File Name : CSLAB_REGRESSION_USING_RANDOM_FOREST_MODEL_V1
Purpose : A Program for Regression Using a Random Forest Model in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 06/02/2019 18:07 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Regression Using a Random Forest Model in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.tree.RandomForest
import org.apache.spark.mllib.tree.model.RandomForestModel
import org.apache.spark.mllib.util.MLUtils
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "sample_libsvm_data.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data file.
val vAR_CSLAB_data = MLUtils.loadLibSVMFile(sc, vAR_CSLAB_FILE_PATH)
// Split the data into training and test sets (30% held out for testing)
val vAR_CSLAB_splits = vAR_CSLAB_data.randomSplit(Array(0.7, 0.3))
val (vAR_CSLAB_trainingData, vAR_CSLAB_testData) = (vAR_CSLAB_splits(0), vAR_CSLAB_splits(1))
// Train a RandomForest model.
// Empty categoricalFeaturesInfo indicates all features are continuous.
val vAR_CSLAB_numClasses = 2
val vAR_CSLAB_categoricalFeaturesInfo = Map[Int, Int]()
val vAR_CSLAB_numTrees = 3 // Use more in practice.
val vAR_CSLAB_featureSubsetStrategy = "auto" // Let the algorithm choose.
val vAR_CSLAB_impurity = "variance"
val vAR_CSLAB_maxDepth = 4
val vAR_CSLAB_maxBins = 32
val vAR_CSLAB_model = RandomForest.trainRegressor(vAR_CSLAB_trainingData, vAR_CSLAB_categoricalFeaturesInfo,
vAR_CSLAB_numTrees, vAR_CSLAB_featureSubsetStrategy, vAR_CSLAB_impurity, vAR_CSLAB_maxDepth, vAR_CSLAB_maxBins)
// Evaluate model on test instances and compute test error
val vAR_CSLAB_labelsAndPredictions = vAR_CSLAB_testData.map { vAR_CSLAB_point =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_point.label, vAR_CSLAB_prediction)
}
val vAR_CSLAB_testMSE = vAR_CSLAB_labelsAndPredictions.map{ case(v, p) => math.pow((v - p), 2)}.mean()
println("Test Mean Squared Error = " + vAR_CSLAB_testMSE)
println("Learned regression forest model:\n" + vAR_CSLAB_model.toDebugString)
// Save and load model
vAR_CSLAB_model.save(sc, "myModelPath3")
val vAR_CSLAB_sameModel = RandomForestModel.load(sc, "myModelPath3")
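// The test MSE computed above is simply the mean of the squared residuals. The same
// arithmetic on a hypothetical handful of (label, prediction) pairs, in plain Scala:

```scala
// Mean squared error over (label, prediction) pairs; the values are illustrative.
val pairs = Seq((3.0, 2.5), (1.0, 1.5), (2.0, 2.0))
val mse = pairs.map { case (v, p) => math.pow(v - p, 2) }.sum / pairs.size
println(mse) // (0.25 + 0.25 + 0.0) / 3
```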
/*****************************
Disclaimer:
We provide this code block strictly for learning and research. It is not production-ready code. We assume no liability for this code under any circumstance; by using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendors who developed or own those products.
*****************************/
/*****************************
File Name : CSLAB_REGRESSION_USING_GRADIENT_BOOSTED_TREES_MODEL_V1
Purpose : A Program for Regression Using a Gradient Boosted Trees Model in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 06/02/2019 18:41 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Regression Using a Gradient Boosted Trees Model in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.tree.GradientBoostedTrees
import org.apache.spark.mllib.tree.configuration.BoostingStrategy
import org.apache.spark.mllib.tree.model.GradientBoostedTreesModel
import org.apache.spark.mllib.util.MLUtils
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "sample_libsvm_data.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data file.
val vAR_CSLAB_data = MLUtils.loadLibSVMFile(sc, vAR_CSLAB_FILE_PATH)
// Split the data into training and test sets (30% held out for testing)
val vAR_CSLAB_splits = vAR_CSLAB_data.randomSplit(Array(0.7, 0.3))
val (vAR_CSLAB_trainingData, vAR_CSLAB_testData) = (vAR_CSLAB_splits(0), vAR_CSLAB_splits(1))
// Train a GradientBoostedTrees model.
// The defaultParams for Regression use SquaredError by default.
val vAR_CSLAB_boostingStrategy = BoostingStrategy.defaultParams("Regression")
vAR_CSLAB_boostingStrategy.numIterations = 3 // Note: Use more iterations in practice.
vAR_CSLAB_boostingStrategy.treeStrategy.maxDepth = 5
// Empty categoricalFeaturesInfo indicates all features are continuous.
vAR_CSLAB_boostingStrategy.treeStrategy.categoricalFeaturesInfo = Map[Int, Int]()
val vAR_CSLAB_model = GradientBoostedTrees.train(vAR_CSLAB_trainingData, vAR_CSLAB_boostingStrategy)
// Evaluate model on test instances and compute test error
val vAR_CSLAB_labelsAndPredictions = vAR_CSLAB_testData.map { vAR_CSLAB_point =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_point.label, vAR_CSLAB_prediction)
}
val vAR_CSLAB_testMSE = vAR_CSLAB_labelsAndPredictions.map{ case(v, p) => math.pow((v - p), 2)}.mean()
println("Test Mean Squared Error = " + vAR_CSLAB_testMSE)
println("Learned regression GBT model:\n" + vAR_CSLAB_model.toDebugString)
// Save and load model
vAR_CSLAB_model.save(sc, "myModelPath4")
val vAR_CSLAB_sameModel = GradientBoostedTreesModel.load(sc, "myModelPath4")
/*****************************
Disclaimer:
We provide this code block strictly for learning and research. It is not production-ready code. We assume no liability for this code under any circumstance; by using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendors who developed or own those products.
*****************************/
/*****************************
File Name : CSLAB_CLASSIFICATION_USING_GRADIENT_BOOSTED_TREES_MODEL_V1
Purpose : A Program for Classification Using a Gradient Boosted Trees Model in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 06/02/2019 19:23 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Classification Using a Gradient Boosted Trees Model in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.tree.GradientBoostedTrees
import org.apache.spark.mllib.tree.configuration.BoostingStrategy
import org.apache.spark.mllib.tree.model.GradientBoostedTreesModel
import org.apache.spark.mllib.util.MLUtils
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "sample_libsvm_data.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data file.
val vAR_CSLAB_data = MLUtils.loadLibSVMFile(sc, vAR_CSLAB_FILE_PATH)
// Split the data into training and test sets (30% held out for testing)
val vAR_CSLAB_splits = vAR_CSLAB_data.randomSplit(Array(0.7, 0.3))
val (vAR_CSLAB_trainingData, vAR_CSLAB_testData) = (vAR_CSLAB_splits(0), vAR_CSLAB_splits(1))
// Train a GradientBoostedTrees model.
// The defaultParams for Classification use LogLoss by default.
val vAR_CSLAB_boostingStrategy = BoostingStrategy.defaultParams("Classification")
vAR_CSLAB_boostingStrategy.numIterations = 3 // Note: Use more iterations in practice.
vAR_CSLAB_boostingStrategy.treeStrategy.numClasses = 2
vAR_CSLAB_boostingStrategy.treeStrategy.maxDepth = 5
// Empty categoricalFeaturesInfo indicates all features are continuous.
vAR_CSLAB_boostingStrategy.treeStrategy.categoricalFeaturesInfo = Map[Int, Int]()
val vAR_CSLAB_model = GradientBoostedTrees.train(vAR_CSLAB_trainingData, vAR_CSLAB_boostingStrategy)
// Evaluate model on test instances and compute test error
val vAR_CSLAB_labelAndPreds = vAR_CSLAB_testData.map { vAR_CSLAB_point =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_point.label, vAR_CSLAB_prediction)
}
val vAR_CSLAB_testErr = vAR_CSLAB_labelAndPreds.filter(r => r._1 != r._2).count.toDouble / vAR_CSLAB_testData.count()
println("Test Error = " + vAR_CSLAB_testErr)
println("Learned classification GBT model:\n" + vAR_CSLAB_model.toDebugString)
// Save and load model
vAR_CSLAB_model.save(sc, "myModelPath6")
val vAR_CSLAB_sameModel = GradientBoostedTreesModel.load(sc, "myModelPath6")
/*****************************
Disclaimer:
We provide this code block strictly for learning and research. It is not production-ready code. We assume no liability for this code under any circumstance; by using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendors who developed or own those products.
*****************************/
/*****************************
File Name : CSLAB_COLLABORATIVE_FILTERING_MODEL_V1
Purpose : A Program for a Collaborative Filtering Model in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 07/02/2019 9:42 hrs
Version : 1.0
*****************************/
// Program Description : A Program for a Collaborative Filtering Model in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.recommendation.ALS
import org.apache.spark.mllib.recommendation.MatrixFactorizationModel
import org.apache.spark.mllib.recommendation.Rating
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "/als/test.data";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data
val vAR_CSLAB_data = sc.textFile(vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_ratings = vAR_CSLAB_data.map(_.split(',') match { case Array(user, item, rate) =>
Rating(user.toInt, item.toInt, rate.toDouble)
})
// Build the recommendation model using ALS
val vAR_CSLAB_rank = 10
val vAR_CSLAB_numIterations = 10
val vAR_CSLAB_model = ALS.train(vAR_CSLAB_ratings, vAR_CSLAB_rank, vAR_CSLAB_numIterations, 0.01)
// Evaluate the model on rating data
val vAR_CSLAB_usersProducts = vAR_CSLAB_ratings.map { case Rating(user, product, rate) =>
(user, product)
}
val vAR_CSLAB_predictions =
vAR_CSLAB_model.predict(vAR_CSLAB_usersProducts).map { case Rating(user, product, rate) =>
((user, product), rate)
}
val vAR_CSLAB_ratesAndPreds = vAR_CSLAB_ratings.map { case Rating(user, product, rate) =>
((user, product), rate)
}.join(vAR_CSLAB_predictions)
val vAR_CSLAB_MSE = vAR_CSLAB_ratesAndPreds.map { case ((user, product), (r1, r2)) =>
val vAR_CSLAB_err = (r1 - r2)
vAR_CSLAB_err * vAR_CSLAB_err
}.mean()
println("Mean Squared Error = " + vAR_CSLAB_MSE)
// Save and load model
vAR_CSLAB_model.save(sc, "myModelPath7")
val vAR_CSLAB_sameModel = MatrixFactorizationModel.load(sc, "myModelPath7")
//If the rating matrix is derived from another source of information (e.g., it is inferred from other signals), you can use the trainImplicit method to get better results.
val vAR_CSLAB_alpha = 0.01
val vAR_CSLAB_lambda = 0.01
val vAR_CSLAB_model1 = ALS.trainImplicit(vAR_CSLAB_ratings, vAR_CSLAB_rank, vAR_CSLAB_numIterations, vAR_CSLAB_lambda, vAR_CSLAB_alpha)
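// The map/match at the top of this program assumes each input line is comma-separated
// as user,item,rating. A minimal plain-Scala sketch of that parse on one hypothetical
// line (no Spark; the tuple stands in for the Rating case class):

```scala
// Assumed input format for the ALS example: one "user,item,rating" triple per line.
val line = "1,101,5.0"
val Array(user, item, rate) = line.split(',')
println((user.toInt, item.toInt, rate.toDouble)) // (1,101,5.0)
```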
/*****************************
Disclaimer:
We provide this code block strictly for learning and research. It is not production-ready code. We assume no liability for this code under any circumstance; by using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendors who developed or own those products.
*****************************/
/*****************************
File Name : CSLAB_LINEAR_SUPPORT_VECTOR_MACHINES_V1
Purpose : A Program for Linear Support Vector Machines in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 07/02/2019 10:21 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Linear Support Vector Machines in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.classification.{SVMModel, SVMWithSGD}
import org.apache.spark.mllib.evaluation.BinaryClassificationMetrics
import org.apache.spark.mllib.util.MLUtils
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "sample_libsvm_data.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
val vAR_CSLAB_data = MLUtils.loadLibSVMFile(sc, vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_splits = vAR_CSLAB_data.randomSplit(Array(0.6, 0.4), seed = 11L)
val vAR_CSLAB_training = vAR_CSLAB_splits(0).cache()
val vAR_CSLAB_test = vAR_CSLAB_splits(1)
val vAR_CSLAB_numIterations = 100
val vAR_CSLAB_model = SVMWithSGD.train(vAR_CSLAB_training, vAR_CSLAB_numIterations)
vAR_CSLAB_model.clearThreshold()
val vAR_CSLAB_scoreAndLabels = vAR_CSLAB_test.map { vAR_CSLAB_point =>
val vAR_CSLAB_score = vAR_CSLAB_model.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_score, vAR_CSLAB_point.label)
}
val vAR_CSLAB_metrics = new BinaryClassificationMetrics(vAR_CSLAB_scoreAndLabels)
val vAR_CSLAB_auROC = vAR_CSLAB_metrics.areaUnderROC()
println("Area under ROC = " + vAR_CSLAB_auROC)
vAR_CSLAB_model.save(sc, "myModelPath8")
val vAR_CSLAB_sameModel = SVMModel.load(sc, "myModelPath8")
/*****************************
Disclaimer:
We provide this code block strictly for learning and research. It is not production-ready code. We assume no liability for this code under any circumstance; by using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendors who developed or own those products.
*****************************/
/*****************************
File Name : CSLAB_LINEAR_LEAST_SQUARES_V1
Purpose : A Program for Linear Least Squares in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 07/02/2019 10:54 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Linear Least Squares in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.mllib.regression.LinearRegressionModel
import org.apache.spark.mllib.regression.LinearRegressionWithSGD
import org.apache.spark.mllib.linalg.Vectors
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "ridge-data/lpsa.data";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data
val vAR_CSLAB_data = sc.textFile(vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_parsedData = vAR_CSLAB_data.map { vAR_CSLAB_line =>
val vAR_CSLAB_parts = vAR_CSLAB_line.split(',')
LabeledPoint(vAR_CSLAB_parts(0).toDouble, Vectors.dense(vAR_CSLAB_parts(1).split(' ').map(_.toDouble)))
}.cache()
// Building the model
val vAR_CSLAB_numIterations = 100
val vAR_CSLAB_model = LinearRegressionWithSGD.train(vAR_CSLAB_parsedData, vAR_CSLAB_numIterations)
// Evaluate model on training examples and compute training error
val vAR_CSLAB_valuesAndPreds = vAR_CSLAB_parsedData.map { vAR_CSLAB_point =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_point.label, vAR_CSLAB_prediction)
}
val vAR_CSLAB_MSE = vAR_CSLAB_valuesAndPreds.map{case(v, p) => math.pow((v - p), 2)}.mean()
println("training Mean Squared Error = " + vAR_CSLAB_MSE)
// Save and load model
vAR_CSLAB_model.save(sc, "myModelPath9")
val vAR_CSLAB_sameModel = LinearRegressionModel.load(sc, "myModelPath9")
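// The parser above expects each line as "label,f1 f2 ... fn": a comma between the label
// and the features, and spaces between features. A plain-Scala sketch on one
// hypothetical line (no Spark; the values are illustrative):

```scala
// Assumed line format for the lpsa data: "label,f1 f2 f3"
val line = "-0.43,2.1 0.0 1.5"
val parts = line.split(',')
val label = parts(0).toDouble
val features = parts(1).split(' ').map(_.toDouble)
println(label)                            // -0.43
println(features.mkString("[", ",", "]")) // [2.1,0.0,1.5]
```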
/*****************************
Disclaimer:
We provide this code block strictly for learning and research. It is not production-ready code. We assume no liability for this code under any circumstance; by using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendors who developed or own those products.
*****************************/
/*****************************
File Name : CSLAB_MULTI_LABEL_METRICS_V1
Purpose : A Program for Multi-Label Metrics in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 07/02/2019 11:23 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Multi-Label Metrics in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.evaluation.MultilabelMetrics
import org.apache.spark.rdd.RDD
object MultiLabelMetricsExample {
def main(args: Array[String]): Unit = {
val vAR_CSLAB_conf = new SparkConf().setAppName("MultiLabelMetricsExample")
//val sc = new SparkContext(conf)
val vAR_CSLAB_scoreAndLabels: RDD[(Array[Double], Array[Double])] = sc.parallelize(
Seq((Array(0.0, 1.0), Array(0.0, 2.0)),
(Array(0.0, 2.0), Array(0.0, 1.0)),
(Array.empty[Double], Array(0.0)),
(Array(2.0), Array(2.0)),
(Array(2.0, 0.0), Array(2.0, 0.0)),
(Array(0.0, 1.0, 2.0), Array(0.0, 1.0)),
(Array(1.0), Array(1.0, 2.0))), 2)
val vAR_CSLAB_metrics = new MultilabelMetrics(vAR_CSLAB_scoreAndLabels)
println(s"Recall = ${vAR_CSLAB_metrics.recall}")
println(s"Precision = ${vAR_CSLAB_metrics.precision}")
println(s"F1 measure = ${vAR_CSLAB_metrics.f1Measure}")
println(s"Accuracy = ${vAR_CSLAB_metrics.accuracy}")
vAR_CSLAB_metrics.labels.foreach(label =>
println(s"Class $label precision = ${vAR_CSLAB_metrics.precision(label)}"))
vAR_CSLAB_metrics.labels.foreach(label => println(s"Class $label recall = ${vAR_CSLAB_metrics.recall(label)}"))
vAR_CSLAB_metrics.labels.foreach(label => println(s"Class $label F1-score = ${vAR_CSLAB_metrics.f1Measure(label)}"))
println(s"Micro recall = ${vAR_CSLAB_metrics.microRecall}")
println(s"Micro precision = ${vAR_CSLAB_metrics.microPrecision}")
println(s"Micro F1 measure = ${vAR_CSLAB_metrics.microF1Measure}")
println(s"Hamming loss = ${vAR_CSLAB_metrics.hammingLoss}")
println(s"Subset accuracy = ${vAR_CSLAB_metrics.subsetAccuracy}")
}
}
MultiLabelMetricsExample.main(Array(" "))
/*****************************
Disclaimer:
We provide this code block strictly for learning and research. It is not production-ready code. We assume no liability for this code under any circumstance; by using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendors who developed or own those products.
*****************************/
/*****************************
File Name : CSLAB_MULTICLASS_METRICS_V1
Purpose : A Program for Multiclass Metrics in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 07/02/2019 11:51 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Multiclass Metrics in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.classification.LogisticRegressionWithLBFGS
import org.apache.spark.mllib.evaluation.MulticlassMetrics
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.mllib.util.MLUtils
object MulticlassMetricsExample {
def main(args: Array[String]): Unit = {
val vAR_CSLAB_conf = new SparkConf().setAppName("MulticlassMetricsExample")
//val sc = new SparkContext(conf)
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "sample_multiclass_classification_data.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
val vAR_CSLAB_data = MLUtils.loadLibSVMFile(sc, vAR_CSLAB_FILE_PATH)
val Array(vAR_CSLAB_training, vAR_CSLAB_test) = vAR_CSLAB_data.randomSplit(Array(0.6, 0.4), seed = 11L)
vAR_CSLAB_training.cache()
val vAR_CSLAB_model = new LogisticRegressionWithLBFGS()
.setNumClasses(3)
.run(vAR_CSLAB_training)
val vAR_CSLAB_predictionAndLabels = vAR_CSLAB_test.map { case LabeledPoint(vAR_CSLAB_label, vAR_CSLAB_features) =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(vAR_CSLAB_features)
(vAR_CSLAB_prediction, vAR_CSLAB_label)
}
val vAR_CSLAB_metrics = new MulticlassMetrics(vAR_CSLAB_predictionAndLabels)
println("Confusion matrix:")
println(vAR_CSLAB_metrics.confusionMatrix)
val vAR_CSLAB_precision = vAR_CSLAB_metrics.precision
val vAR_CSLAB_recall = vAR_CSLAB_metrics.recall // same as true positive rate
val vAR_CSLAB_f1Score = vAR_CSLAB_metrics.fMeasure
println("Summary Statistics")
println(s"Precision = $vAR_CSLAB_precision")
println(s"Recall = $vAR_CSLAB_recall")
println(s"F1 Score = $vAR_CSLAB_f1Score")
val vAR_CSLAB_labels = vAR_CSLAB_metrics.labels
vAR_CSLAB_labels.foreach { l =>
println(s"Precision($l) = " + vAR_CSLAB_metrics.precision(l))
}
vAR_CSLAB_labels.foreach { l =>
println(s"Recall($l) = " + vAR_CSLAB_metrics.recall(l))
}
vAR_CSLAB_labels.foreach { l =>
println(s"FPR($l) = " + vAR_CSLAB_metrics.falsePositiveRate(l))
}
vAR_CSLAB_labels.foreach { l =>
println(s"F1-Score($l) = " + vAR_CSLAB_metrics.fMeasure(l))
}
println(s"Weighted precision: ${vAR_CSLAB_metrics.weightedPrecision}")
println(s"Weighted recall: ${vAR_CSLAB_metrics.weightedRecall}")
println(s"Weighted F1 score: ${vAR_CSLAB_metrics.weightedFMeasure}")
println(s"Weighted false positive rate: ${vAR_CSLAB_metrics.weightedFalsePositiveRate}")
}
}
MulticlassMetricsExample.main(Array(" "))
/*****************************
Disclaimer:
We provide this code block strictly for learning and research. It is not production-ready code. We assume no liability for this code under any circumstance; by using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendors who developed or own those products.
*****************************/
/*****************************
File Name : CSLAB_POWER_ITERATION_CLUSTERING_V1
Purpose : A Program for Power Iteration Clustering in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 07/02/2019 12:19 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Power Iteration Clustering in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.clustering.{PowerIterationClustering, PowerIterationClusteringModel}
import org.apache.spark.mllib.linalg.Vectors
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "pic_data.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data
val vAR_CSLAB_data = sc.textFile(vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_similarities = vAR_CSLAB_data.map { vAR_CSLAB_line =>
val vAR_CSLAB_parts = vAR_CSLAB_line.split(' ')
(vAR_CSLAB_parts(0).toLong, vAR_CSLAB_parts(1).toLong, vAR_CSLAB_parts(2).toDouble)
}
// Cluster the data into two classes using PowerIterationClustering
val vAR_CSLAB_pic = new PowerIterationClustering()
.setK(2)
.setMaxIterations(10)
val vAR_CSLAB_model = vAR_CSLAB_pic.run(vAR_CSLAB_similarities)
vAR_CSLAB_model.assignments.foreach { vAR_CSLAB_a =>
println(s"${vAR_CSLAB_a.id} -> ${vAR_CSLAB_a.cluster}")
}
// Save and load model
vAR_CSLAB_model.save(sc, "myModelPath10")
val vAR_CSLAB_sameModel = PowerIterationClusteringModel.load(sc, "myModelPath10")
/*****************************
Disclaimer:
We provide this code block strictly for learning and research. It is not production-ready code. We assume no liability for this code under any circumstance; by using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendors who developed or own those products.
*****************************/
/*****************************
File Name : CSLAB_LATENT_DIRICHLET_ALLOCATION_V1
Purpose : A Program for Latent Dirichlet Allocation in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 07/02/2019 12:47 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Latent Dirichlet Allocation in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.clustering.{LDA, DistributedLDAModel}
import org.apache.spark.mllib.linalg.Vectors
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "sample_lda_data.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data
val vAR_CSLAB_data = sc.textFile(vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_parsedData = vAR_CSLAB_data.map(s => Vectors.dense(s.trim.split(' ').map(_.toDouble)))
// Index documents with unique IDs
val vAR_CSLAB_corpus = vAR_CSLAB_parsedData.zipWithIndex.map(_.swap).cache()
// Cluster the documents into three topics using LDA
val vAR_CSLAB_ldaModel = new LDA().setK(3).run(vAR_CSLAB_corpus)
// Output topics. Each is a distribution over words (matching word count vectors)
println("Learned topics (as distributions over vocab of " + vAR_CSLAB_ldaModel.vocabSize + " words):")
val vAR_CSLAB_topics = vAR_CSLAB_ldaModel.topicsMatrix
for (vAR_CSLAB_topic <- Range(0, 3)) {
print("Topic " + vAR_CSLAB_topic + ":")
for (vAR_CSLAB_word <- Range(0, vAR_CSLAB_ldaModel.vocabSize)) { print(" " + vAR_CSLAB_topics(vAR_CSLAB_word, vAR_CSLAB_topic)); }
println()
}
// Save and load model.
vAR_CSLAB_ldaModel.save(sc, "myLDAModel11")
val vAR_CSLAB_sameModel = DistributedLDAModel.load(sc, "myLDAModel11")
/*****************************
Disclaimer:
We provide this code block strictly for learning and research. It is not production-ready code. We assume no liability for this code under any circumstance; by using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendors who developed or own those products.
*****************************/
/*****************************
File Name : CSLAB_MULTI_LAYER_PERCEPTRON_V1
Purpose : A Program for a Multilayer Perceptron in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 07/02/2019 13:19 hrs
Version : 1.0
*****************************/
// Program Description : A Program for a Multilayer Perceptron in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.ml.classification.MultilayerPerceptronClassifier
import org.apache.spark.ml.evaluation.MulticlassClassificationEvaluator
import org.apache.spark.mllib.util.MLUtils
import org.apache.spark.sql.Row
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "sample_multiclass_classification_data.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load the training data as a DataFrame; the spark.ml classifier works on DataFrames, not RDDs
val vAR_CSLAB_data = spark.read.format("libsvm").load(vAR_CSLAB_FILE_PATH)
// Split the data into train and test
val vAR_CSLAB_splits = vAR_CSLAB_data.randomSplit(Array(0.6, 0.4), seed = 1234L)
val vAR_CSLAB_train = vAR_CSLAB_splits(0)
val vAR_CSLAB_test = vAR_CSLAB_splits(1)
// Specify layers for the neural network:
// input layer of size 4 (features), two hidden layers of size 5 and 4, and output of size 3 (classes)
val vAR_CSLAB_layers = Array[Int](4, 5, 4, 3)
// Create the trainer and set its parameters
val vAR_CSLAB_trainer = new MultilayerPerceptronClassifier()
  .setLayers(vAR_CSLAB_layers)
  .setBlockSize(128)
  .setSeed(1234L)
  .setMaxIter(100)
// Train the model and compute accuracy on the held-out test set
val vAR_CSLAB_mlpModel = vAR_CSLAB_trainer.fit(vAR_CSLAB_train)
val vAR_CSLAB_result = vAR_CSLAB_mlpModel.transform(vAR_CSLAB_test)
val vAR_CSLAB_evaluator = new MulticlassClassificationEvaluator().setMetricName("accuracy")
println("Test set accuracy = " + vAR_CSLAB_evaluator.evaluate(vAR_CSLAB_result.select("prediction", "label")))
/*****************************
Disclaimer:
We provide this code block strictly for learning and research. It is not production-ready code. We assume no liability for this code under any circumstance; by using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendors who developed or own those products.
*****************************/
/*****************************
File Name : CSLAB_FREQUENT_PATTERN_MINING_GROWTH_V1
Purpose : A Program for Frequent Pattern Mining with FP-Growth in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 07/02/2019 14:07 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Frequent Pattern Mining with FP-Growth in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.rdd.RDD
import org.apache.spark.mllib.fpm.FPGrowth
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "sample_fpgrowth.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
val vAR_CSLAB_data = sc.textFile(vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_transactions: RDD[Array[String]] = vAR_CSLAB_data.map(s => s.trim.split(' '))
val vAR_CSLAB_fpg = new FPGrowth()
.setMinSupport(0.2)
.setNumPartitions(10)
val vAR_CSLAB_model = vAR_CSLAB_fpg.run(vAR_CSLAB_transactions)
vAR_CSLAB_model.freqItemsets.collect().foreach { vAR_CSLAB_itemset =>
println(vAR_CSLAB_itemset.items.mkString("[", ",", "]") + ", " + vAR_CSLAB_itemset.freq)
}
val vAR_CSLAB_minConfidence = 0.8
vAR_CSLAB_model.generateAssociationRules(vAR_CSLAB_minConfidence).collect().foreach { vAR_CSLAB_rule =>
println(
vAR_CSLAB_rule.antecedent.mkString("[", ",", "]")
+ " => " + vAR_CSLAB_rule.consequent.mkString("[", ",", "]")
+ ", " + vAR_CSLAB_rule.confidence)
}
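To make the frequencies above concrete, here is a Spark-free sketch of the support counting FP-growth performs for single items. The transactions mirror the contents of sample_fpgrowth.txt, and the 0.2 minimum support matches setMinSupport above.

```scala
// Support of an itemset = fraction of transactions containing it.
// FP-growth computes this efficiently via an FP-tree; a brute-force count
// gives the same single-item frequencies.
val transactions = Seq(
  Set("r", "z", "h", "k", "p"),
  Set("z", "y", "x", "w", "v", "u", "t", "s"),
  Set("s", "x", "o", "n", "r"),
  Set("x", "z", "y", "m", "t", "s", "q", "e"),
  Set("z"),
  Set("x", "z", "y", "r", "q", "t", "p")
)
val minSupport = 0.2
val itemCounts = transactions.flatten.groupBy(identity).map { case (item, occs) => item -> occs.size }
// Keep items whose relative support clears the threshold (0.2 * 6 transactions = at least 2)
val frequentItems = itemCounts.filter { case (_, count) => count.toDouble / transactions.size >= minSupport }
frequentItems.toSeq.sortBy(-_._2).foreach { case (item, count) => println(s"[$item], $count") }
```

The library extends the same idea to multi-item sets without enumerating every candidate.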
/*****************************
File Name : CSLAB_ASSOCIATION_RULES_V1
Purpose : A Program for Association Rules in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 07/02/2019 14:28 hrs
Version : 1.0
*****************************/
## Program Description : A Program for Association Rules in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.rdd.RDD
import org.apache.spark.mllib.fpm.AssociationRules
import org.apache.spark.mllib.fpm.FPGrowth.FreqItemset
val vAR_CSLAB_freqItemsets = sc.parallelize(Seq(
new FreqItemset(Array("a"), 15L),
new FreqItemset(Array("b"), 35L),
new FreqItemset(Array("a", "b"), 12L)
));
val vAR_CSLAB_ar = new AssociationRules()
.setMinConfidence(0.8)
val vAR_CSLAB_results = vAR_CSLAB_ar.run(vAR_CSLAB_freqItemsets)
vAR_CSLAB_results.collect().foreach { vAR_CSLAB_rule =>
println("[" + vAR_CSLAB_rule.antecedent.mkString(",")
+ "=>"
+ vAR_CSLAB_rule.consequent.mkString(",") + "]," + vAR_CSLAB_rule.confidence)
}
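The confidence printed above can be re-derived by hand: confidence(X => Y) = freq(X union Y) / freq(X). Using the same three itemsets:

```scala
// Rule confidence from raw itemset frequencies; only a => b clears 0.8.
val freq = Map(Set("a") -> 15.0, Set("b") -> 35.0, Set("a", "b") -> 12.0)
val confAtoB = freq(Set("a", "b")) / freq(Set("a"))  // 12 / 15 = 0.8
val confBtoA = freq(Set("a", "b")) / freq(Set("b"))  // 12 / 35, below the cutoff
println(s"[a=>b], $confAtoB")
println(s"[b=>a], $confBtoA")
```

This is why the program prints only the a => b rule at minConfidence = 0.8.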
/*****************************
File Name : CSLAB_PREFIXSPAN_ALGORITHM_V1
Purpose : A Program for PrefixSpan Algorithm in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 07/02/2019 14:54 hrs
Version : 1.0
*****************************/
## Program Description : A Program for PrefixSpan Algorithm in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.fpm.PrefixSpan
val vAR_CSLAB_sequences = sc.parallelize(Seq(
Array(Array(1, 2), Array(3)),
Array(Array(1), Array(3, 2), Array(1, 2)),
Array(Array(1, 2), Array(5)),
Array(Array(6))
), 2).cache()
val vAR_CSLAB_prefixSpan = new PrefixSpan()
.setMinSupport(0.5)
.setMaxPatternLength(5)
val vAR_CSLAB_model = vAR_CSLAB_prefixSpan.run(vAR_CSLAB_sequences)
vAR_CSLAB_model.freqSequences.collect().foreach { vAR_CSLAB_freqSequence =>
println(
vAR_CSLAB_freqSequence.sequence.map(_.mkString("[", ", ", "]")).mkString("[", ", ", "]") + ", " + vAR_CSLAB_freqSequence.freq)
}
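For intuition about the frequencies printed above: a sequential pattern's support is the number of sequences that contain it. A brute-force count for single-item patterns over the same four sequences:

```scala
// Support of a single-item pattern = number of sequences in which the item
// occurs in at least one itemset. With minSupport = 0.5 of 4 sequences,
// a pattern needs support >= 2 to be frequent.
val sequences = Seq(
  Seq(Set(1, 2), Set(3)),
  Seq(Set(1), Set(3, 2), Set(1, 2)),
  Seq(Set(1, 2), Set(5)),
  Seq(Set(6))
)
def support(item: Int): Int = sequences.count(_.exists(_.contains(item)))
Seq(1, 2, 3, 5, 6).foreach(i => println(s"[[$i]], ${support(i)}"))
```

PrefixSpan extends this to multi-itemset patterns, requiring that the itemsets appear in order within a sequence.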
/*****************************
File Name : CSLAB_BINARY_CLASSIFICATION_THRESHOLD_TUNING_V1
Purpose : A Program for Binary Classification - Threshold Tuning in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 07/02/2019 15:19 hrs
Version : 1.0
*****************************/
## Program Description : A Program for Binary Classification - Threshold Tuning in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.classification.LogisticRegressionWithLBFGS
import org.apache.spark.mllib.evaluation.BinaryClassificationMetrics
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.mllib.util.MLUtils
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "sample_binary_classification_data.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load training data in LIBSVM format
val vAR_CSLAB_data = MLUtils.loadLibSVMFile(sc, vAR_CSLAB_FILE_PATH)
// Split data into training (60%) and test (40%)
val Array(vAR_CSLAB_training, vAR_CSLAB_test) = vAR_CSLAB_data.randomSplit(Array(0.6, 0.4), seed = 11L)
vAR_CSLAB_training.cache()
// Run training algorithm to build the model
val vAR_CSLAB_model = new LogisticRegressionWithLBFGS()
.setNumClasses(2)
.run(vAR_CSLAB_training)
// Clear the prediction threshold so the model will return probabilities
vAR_CSLAB_model.clearThreshold()
// Compute raw scores on the test set
val vAR_CSLAB_predictionAndLabels = vAR_CSLAB_test.map { case LabeledPoint(label, features) =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(features)
(vAR_CSLAB_prediction, label)
}
// Instantiate metrics object
val vAR_CSLAB_metrics = new BinaryClassificationMetrics(vAR_CSLAB_predictionAndLabels)
// Precision by threshold
val vAR_CSLAB_precision = vAR_CSLAB_metrics.precisionByThreshold
vAR_CSLAB_precision.foreach { case (t, p) =>
println(s"Threshold: $t, Precision: $p")
}
// Recall by threshold
val vAR_CSLAB_recall = vAR_CSLAB_metrics.recallByThreshold
vAR_CSLAB_recall.foreach { case (t, r) =>
println(s"Threshold: $t, Recall: $r")
}
// Precision-Recall Curve
val vAR_CSLAB_PRC = vAR_CSLAB_metrics.pr
// F-measure
val vAR_CSLAB_f1Score = vAR_CSLAB_metrics.fMeasureByThreshold
vAR_CSLAB_f1Score.foreach { case (t, f) =>
println(s"Threshold: $t, F-score: $f, Beta = 1")
}
val vAR_CSLAB_beta = 0.5
val vAR_CSLAB_fScore = vAR_CSLAB_metrics.fMeasureByThreshold(vAR_CSLAB_beta)
vAR_CSLAB_fScore.foreach { case (t, f) =>
println(s"Threshold: $t, F-score: $f, Beta = 0.5")
}
// AUPRC
val vAR_CSLAB_auPRC = vAR_CSLAB_metrics.areaUnderPR
println("Area under precision-recall curve = " + vAR_CSLAB_auPRC)
// Compute thresholds used in ROC and PR curves
val vAR_CSLAB_thresholds = vAR_CSLAB_precision.map(_._1)
// ROC Curve
val vAR_CSLAB_roc = vAR_CSLAB_metrics.roc
// AUROC
val vAR_CSLAB_auROC = vAR_CSLAB_metrics.areaUnderROC
println("Area under ROC = " + vAR_CSLAB_auROC)
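fMeasureByThreshold applies the weighted harmonic mean F_beta = (1 + beta^2) * P * R / (beta^2 * P + R) at every threshold. A hand computation for one illustrative precision/recall pair shows how beta = 0.5 shifts weight toward precision:

```scala
// F-measure as a function of precision, recall, and beta.
def fMeasure(precision: Double, recall: Double, beta: Double = 1.0): Double = {
  val b2 = beta * beta
  (1 + b2) * precision * recall / (b2 * precision + recall)
}
val p = 0.75  // illustrative precision at some threshold
val r = 0.6   // illustrative recall at the same threshold
println(s"F1   = ${fMeasure(p, r)}")       // balanced harmonic mean
println(s"F0.5 = ${fMeasure(p, r, 0.5)}")  // favors the higher precision
```

Choosing the threshold that maximizes the F-score for your preferred beta is the usual way to tune the decision cutoff.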
/*****************************
File Name : CSLAB_MULTICLASS_CLASSIFICATION_V1
Purpose : A Program for Multiclass Classification in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 07/02/2019 15:47 hrs
Version : 1.0
*****************************/
## Program Description : A Program for Multiclass Classification in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.classification.LogisticRegressionWithLBFGS
import org.apache.spark.mllib.evaluation.MulticlassMetrics
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.mllib.util.MLUtils
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "sample_multiclass_classification_data.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load training data in LIBSVM format
val vAR_CSLAB_data = MLUtils.loadLibSVMFile(sc, vAR_CSLAB_FILE_PATH)
// Split data into training (60%) and test (40%)
val Array(vAR_CSLAB_training, vAR_CSLAB_test) = vAR_CSLAB_data.randomSplit(Array(0.6, 0.4), seed = 11L)
vAR_CSLAB_training.cache()
// Run training algorithm to build the model
val vAR_CSLAB_model = new LogisticRegressionWithLBFGS()
.setNumClasses(3)
.run(vAR_CSLAB_training)
// Compute raw scores on the test set
val vAR_CSLAB_predictionAndLabels = vAR_CSLAB_test.map { case LabeledPoint(label, features) =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(features)
(vAR_CSLAB_prediction, label)
}
// Instantiate metrics object
val vAR_CSLAB_metrics = new MulticlassMetrics(vAR_CSLAB_predictionAndLabels)
// Confusion matrix
println("Confusion matrix:")
println(vAR_CSLAB_metrics.confusionMatrix)
// Overall Statistics
val vAR_CSLAB_precision = vAR_CSLAB_metrics.precision
val vAR_CSLAB_recall = vAR_CSLAB_metrics.recall // same as true positive rate
val vAR_CSLAB_f1Score = vAR_CSLAB_metrics.fMeasure
println("Summary Statistics")
println(s"Precision = $vAR_CSLAB_precision")
println(s"Recall = $vAR_CSLAB_recall")
println(s"F1 Score = $vAR_CSLAB_f1Score")
// Precision by label
val vAR_CSLAB_labels = vAR_CSLAB_metrics.labels
vAR_CSLAB_labels.foreach { l =>
println(s"Precision($l) = " + vAR_CSLAB_metrics.precision(l))
}
// Recall by label
vAR_CSLAB_labels.foreach { l =>
println(s"Recall($l) = " + vAR_CSLAB_metrics.recall(l))
}
// False positive rate by label
vAR_CSLAB_labels.foreach { l =>
println(s"FPR($l) = " + vAR_CSLAB_metrics.falsePositiveRate(l))
}
// F-measure by label
vAR_CSLAB_labels.foreach { l =>
println(s"F1-Score($l) = " + vAR_CSLAB_metrics.fMeasure(l))
}
// Weighted stats
println(s"Weighted precision: ${vAR_CSLAB_metrics.weightedPrecision}")
println(s"Weighted recall: ${vAR_CSLAB_metrics.weightedRecall}")
println(s"Weighted F1 score: ${vAR_CSLAB_metrics.weightedFMeasure}")
println(s"Weighted false positive rate: ${vAR_CSLAB_metrics.weightedFalsePositiveRate}")
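The per-label numbers above come straight from the confusion matrix: precision(l) divides the diagonal entry by its column sum (predicted l), recall(l) by its row sum (actual l). A 3-class toy matrix with illustrative counts:

```scala
// Rows = actual class, columns = predicted class.
val confusion = Array(
  Array(8.0, 1.0, 1.0),
  Array(2.0, 7.0, 1.0),
  Array(0.0, 2.0, 8.0)
)
def precisionOf(l: Int): Double = confusion(l)(l) / confusion.map(_(l)).sum
def recallOf(l: Int): Double = confusion(l)(l) / confusion(l).sum
(0 until 3).foreach { l =>
  println(f"Precision($l) = ${precisionOf(l)}%.3f, Recall($l) = ${recallOf(l)}%.3f")
}
```

The weighted variants reported by MulticlassMetrics average these per-label values, weighting each label by its frequency in the data.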
/*****************************
File Name : CSLAB_MULTILABEL_CLASSIFICATION_V1
Purpose : A Program for Multilabel Classification in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 07/02/2019 16:16 hrs
Version : 1.0
*****************************/
## Program Description : A Program for Multilabel Classification in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.evaluation.MultilabelMetrics
import org.apache.spark.rdd.RDD;
val vAR_CSLAB_scoreAndLabels: RDD[(Array[Double], Array[Double])] = sc.parallelize(
Seq((Array(0.0, 1.0), Array(0.0, 2.0)),
(Array(0.0, 2.0), Array(0.0, 1.0)),
(Array(), Array(0.0)),
(Array(2.0), Array(2.0)),
(Array(2.0, 0.0), Array(2.0, 0.0)),
(Array(0.0, 1.0, 2.0), Array(0.0, 1.0)),
(Array(1.0), Array(1.0, 2.0))), 2)
// Instantiate metrics object
val vAR_CSLAB_metrics = new MultilabelMetrics(vAR_CSLAB_scoreAndLabels)
// Summary stats
println(s"Recall = ${vAR_CSLAB_metrics.recall}")
println(s"Precision = ${vAR_CSLAB_metrics.precision}")
println(s"F1 measure = ${vAR_CSLAB_metrics.f1Measure}")
println(s"Accuracy = ${vAR_CSLAB_metrics.accuracy}")
// Individual label stats
vAR_CSLAB_metrics.labels.foreach(label => println(s"Class $label precision = ${vAR_CSLAB_metrics.precision(label)}"))
vAR_CSLAB_metrics.labels.foreach(label => println(s"Class $label recall = ${vAR_CSLAB_metrics.recall(label)}"))
vAR_CSLAB_metrics.labels.foreach(label => println(s"Class $label F1-score = ${vAR_CSLAB_metrics.f1Measure(label)}"))
// Micro stats
println(s"Micro recall = ${vAR_CSLAB_metrics.microRecall}")
println(s"Micro precision = ${vAR_CSLAB_metrics.microPrecision}")
println(s"Micro F1 measure = ${vAR_CSLAB_metrics.microF1Measure}")
// Hamming loss
println(s"Hamming loss = ${vAR_CSLAB_metrics.hammingLoss}")
// Subset accuracy
println(s"Subset accuracy = ${vAR_CSLAB_metrics.subsetAccuracy}")
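Two of the metrics above are easy to verify by hand on the same seven (predicted, actual) pairs: Hamming loss averages each example's symmetric difference over the 3 distinct labels, and subset accuracy counts exact matches.

```scala
// Re-deriving Hamming loss and subset accuracy without Spark.
val pairs: Seq[(Set[Double], Set[Double])] = Seq(
  (Set(0.0, 1.0), Set(0.0, 2.0)),
  (Set(0.0, 2.0), Set(0.0, 1.0)),
  (Set[Double](), Set(0.0)),
  (Set(2.0), Set(2.0)),
  (Set(2.0, 0.0), Set(2.0, 0.0)),
  (Set(0.0, 1.0, 2.0), Set(0.0, 1.0)),
  (Set(1.0), Set(1.0, 2.0))
)
val numLabels = 3.0
val hammingLoss = pairs.map { case (pred, actual) =>
  ((pred diff actual) ++ (actual diff pred)).size / numLabels
}.sum / pairs.size
val subsetAccuracy = pairs.count { case (pred, actual) => pred == actual }.toDouble / pairs.size
println(s"Hamming loss = $hammingLoss")        // 1/3
println(s"Subset accuracy = $subsetAccuracy")  // 2/7: only pairs 4 and 5 match exactly
```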
/*****************************
File Name : CSLAB_RANKING_ALGORITHM_V1
Purpose : A Program for a Ranking Algorithm in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 07/02/2019 16:48 hrs
Version : 1.0
*****************************/
## Program Description : A Program for a Ranking Algorithm in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.evaluation.{RegressionMetrics, RankingMetrics}
import org.apache.spark.mllib.recommendation.{ALS, Rating}
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "sample_movielens_data.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Read in the ratings data
val vAR_CSLAB_ratings = sc.textFile(vAR_CSLAB_FILE_PATH).map { line =>
val vAR_CSLAB_fields = line.split("::")
Rating(vAR_CSLAB_fields(0).toInt, vAR_CSLAB_fields(1).toInt, vAR_CSLAB_fields(2).toDouble - 2.5)
}.cache()
// Map ratings to 1 or 0, 1 indicating a movie that should be recommended
val vAR_CSLAB_binarizedRatings = vAR_CSLAB_ratings.map(r => Rating(r.user, r.product, if (r.rating > 0) 1.0 else 0.0)).cache()
// Summarize ratings
val vAR_CSLAB_numRatings = vAR_CSLAB_ratings.count()
val vAR_CSLAB_numUsers = vAR_CSLAB_ratings.map(_.user).distinct().count()
val vAR_CSLAB_numMovies = vAR_CSLAB_ratings.map(_.product).distinct().count()
println(s"Got $vAR_CSLAB_numRatings ratings from $vAR_CSLAB_numUsers users on $vAR_CSLAB_numMovies movies.")
// Build the model
val vAR_CSLAB_numIterations = 10
val vAR_CSLAB_rank = 10
val vAR_CSLAB_lambda = 0.01
val vAR_CSLAB_model = ALS.train(vAR_CSLAB_ratings, vAR_CSLAB_rank, vAR_CSLAB_numIterations, vAR_CSLAB_lambda)
// Define a function to scale ratings from 0 to 1
def scaledRating(r: Rating): Rating = {
val scaledRating = math.max(math.min(r.rating, 1.0), 0.0)
Rating(r.user, r.product, scaledRating)
}
// Get sorted top ten predictions for each user and then scale from [0, 1]
val vAR_CSLAB_userRecommended = vAR_CSLAB_model.recommendProductsForUsers(10).map{ case (user, recs) =>
(user, recs.map(scaledRating))
}
// Assume that any movie a user rated 3 or higher (which maps to a 1) is a relevant document
// Compare with top ten most relevant documents
val vAR_CSLAB_userMovies = vAR_CSLAB_binarizedRatings.groupBy(_.user)
val vAR_CSLAB_relevantDocuments = vAR_CSLAB_userMovies.join(vAR_CSLAB_userRecommended).map{ case (user, (actual, predictions)) =>
(predictions.map(_.product), actual.filter(_.rating > 0.0).map(_.product).toArray)
}
// Instantiate metrics object
val vAR_CSLAB_metrics = new RankingMetrics(vAR_CSLAB_relevantDocuments)
// Precision at K
Array(1, 3, 5).foreach{ k =>
println(s"Precision at $k = ${vAR_CSLAB_metrics.precisionAt(k)}")
}
// Mean average precision
println(s"Mean average precision = ${vAR_CSLAB_metrics.meanAveragePrecision}")
// Normalized discounted cumulative gain
Array(1, 3, 5).foreach{ k =>
println(s"NDCG at $k = ${vAR_CSLAB_metrics.ndcgAt(k)}")
}
// Get predictions for each data point
val vAR_CSLAB_allPredictions = vAR_CSLAB_model.predict(vAR_CSLAB_ratings.map(r => (r.user, r.product))).map(r => ((r.user, r.product), r.rating))
val vAR_CSLAB_allRatings = vAR_CSLAB_ratings.map(r => ((r.user, r.product), r.rating))
val vAR_CSLAB_predictionsAndLabels = vAR_CSLAB_allPredictions.join(vAR_CSLAB_allRatings).map{ case ((user, product), (predicted, actual)) =>
(predicted, actual)
}
// Get the RMSE using regression metrics
val vAR_CSLAB_regressionMetrics = new RegressionMetrics(vAR_CSLAB_predictionsAndLabels)
println(s"RMSE = ${vAR_CSLAB_regressionMetrics.rootMeanSquaredError}")
// R-squared
println(s"R-squared = ${vAR_CSLAB_regressionMetrics.r2}")
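precisionAt(k) averages, over users, the fraction of each user's top-k recommendations that fall in that user's relevant set. A two-user toy example with hypothetical item ids:

```scala
// Precision at k for a set of (ranked recommendations, relevant items) pairs.
def precisionAt(k: Int, data: Seq[(Seq[Int], Set[Int])]): Double =
  data.map { case (recommended, relevant) =>
    recommended.take(k).count(relevant.contains).toDouble / k
  }.sum / data.size
val toy = Seq(
  (Seq(1, 2, 3, 4, 5), Set(1, 3, 5)),  // 2 of the top 3 are relevant
  (Seq(9, 8, 7, 6, 5), Set(7, 5))      // 1 of the top 3 is relevant
)
println(s"Precision at 3 = ${precisionAt(3, toy)}")  // (2/3 + 1/3) / 2 = 0.5
```

NDCG refines this by discounting relevant items that appear lower in the ranking.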
/*****************************
File Name : CSLAB_PMML_MODEL_EXPORT_V1
Purpose : A Program for PMML Model Export in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 07/02/2019 17:17 hrs
Version : 1.0
*****************************/
## Program Description : A Program for PMML Model Export in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.clustering.KMeans
import org.apache.spark.mllib.linalg.Vectors
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "kmeans_data.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data
val vAR_CSLAB_data = sc.textFile(vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_parsedData = vAR_CSLAB_data.map(s => Vectors.dense(s.split(' ').map(_.toDouble))).cache()
// Cluster the data into two classes using KMeans
val vAR_CSLAB_numClusters = 2
val vAR_CSLAB_numIterations = 20
val vAR_CSLAB_clusters = KMeans.train(vAR_CSLAB_parsedData, vAR_CSLAB_numClusters, vAR_CSLAB_numIterations)
// Export to PMML
println("PMML Model:\n" + vAR_CSLAB_clusters.toPMML)
/*****************************
File Name : CSLAB_LBFGS_OPTIMIZER_V1
Purpose : A Program for L-BFGS Optimizer in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 07/02/2019 17:42 hrs
Version : 1.0
*****************************/
## Program Description : A Program for L-BFGS Optimizer in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.SparkContext
import org.apache.spark.mllib.evaluation.BinaryClassificationMetrics
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.util.MLUtils
import org.apache.spark.mllib.classification.LogisticRegressionModel
import org.apache.spark.mllib.optimization.{LBFGS, LogisticGradient, SquaredL2Updater}
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "sample_libsvm_data.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
val vAR_CSLAB_data = MLUtils.loadLibSVMFile(sc, vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_numFeatures = vAR_CSLAB_data.take(1)(0).features.size
// Split data into training (60%) and test (40%).
val vAR_CSLAB_splits = vAR_CSLAB_data.randomSplit(Array(0.6, 0.4), seed = 11L)
// Append 1 into the training data as intercept.
val vAR_CSLAB_training = vAR_CSLAB_splits(0).map(vAR_CSLAB_x => (vAR_CSLAB_x.label, MLUtils.appendBias(vAR_CSLAB_x.features))).cache()
val vAR_CSLAB_test = vAR_CSLAB_splits(1)
// Run training algorithm to build the model
val vAR_CSLAB_numCorrections = 10
val vAR_CSLAB_convergenceTol = 1e-4
val vAR_CSLAB_maxNumIterations = 20
val vAR_CSLAB_regParam = 0.1
val vAR_CSLAB_initialWeightsWithIntercept = Vectors.dense(new Array[Double](vAR_CSLAB_numFeatures + 1))
val (weightsWithIntercept, loss) = LBFGS.runLBFGS(
vAR_CSLAB_training,
new LogisticGradient(),
new SquaredL2Updater(),
vAR_CSLAB_numCorrections,
vAR_CSLAB_convergenceTol,
vAR_CSLAB_maxNumIterations,
vAR_CSLAB_regParam,
vAR_CSLAB_initialWeightsWithIntercept)
val vAR_CSLAB_model = new LogisticRegressionModel(
Vectors.dense(weightsWithIntercept.toArray.slice(0, weightsWithIntercept.size - 1)),
weightsWithIntercept(weightsWithIntercept.size - 1))
// Clear the default threshold.
vAR_CSLAB_model.clearThreshold()
// Compute raw scores on the test set.
val vAR_CSLAB_scoreAndLabels = vAR_CSLAB_test.map { vAR_CSLAB_point =>
val vAR_CSLAB_score = vAR_CSLAB_model.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_score, vAR_CSLAB_point.label)
}
// Get evaluation metrics.
val vAR_CSLAB_metrics = new BinaryClassificationMetrics(vAR_CSLAB_scoreAndLabels)
val vAR_CSLAB_auROC = vAR_CSLAB_metrics.areaUnderROC()
println("Loss of each step in training process")
loss.foreach(println)
println("Area under ROC = " + vAR_CSLAB_auROC)
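After clearThreshold(), predict returns the raw logistic score sigma(w · x + intercept) rather than a 0/1 label. A hand computation with illustrative (not trained) weights:

```scala
// The logistic score underlying LogisticRegressionModel.predict.
def sigmoid(z: Double): Double = 1.0 / (1.0 + math.exp(-z))
val weights = Array(0.5, -0.25)  // illustrative weight vector
val intercept = 0.1              // illustrative intercept
val features = Array(2.0, 4.0)
val margin = weights.zip(features).map { case (w, x) => w * x }.sum + intercept
println(s"score = ${sigmoid(margin)}")  // a probability in (0, 1)
```

L-BFGS searches for the weights that minimize the regularized logistic loss of these scores over the training set.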
/*****************************
File Name : CSLAB_ABSTRACT_PARAMETERS_V1
Purpose : A Program for Abstract Parameters in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 08/02/2019 09:31 hrs
Version : 1.0
*****************************/
## Program Description : A Program for Abstract Parameters in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import scala.reflect.runtime.universe._
abstract class AbstractParams[T: TypeTag] {
private def tag: TypeTag[T] = typeTag[T]
override def toString: String = {
val vAR_CSLAB_tpe = tag.tpe
val vAR_CSLAB_allAccessors = vAR_CSLAB_tpe.decls.collect {
case m: MethodSymbol if m.isCaseAccessor => m
}
val vAR_CSLAB_mirror = runtimeMirror(getClass.getClassLoader)
val vAR_CSLAB_instanceMirror = vAR_CSLAB_mirror.reflect(this)
vAR_CSLAB_allAccessors.map { vAR_CSLAB_f =>
val vAR_CSLAB_paramName = vAR_CSLAB_f.name.toString
val vAR_CSLAB_fieldMirror = vAR_CSLAB_instanceMirror.reflectField(vAR_CSLAB_f)
val vAR_CSLAB_paramValue = vAR_CSLAB_fieldMirror.get
s" $vAR_CSLAB_paramName:\t$vAR_CSLAB_paramValue"
}.mkString("{\n", ",\n", "\n}")
}
}
// Demonstrate the generated toString on a sample case class
case class vAR_CSLAB_DemoParams(input: String = "data.txt", maxIter: Int = 10) extends AbstractParams[vAR_CSLAB_DemoParams]
println(vAR_CSLAB_DemoParams())
/*****************************
File Name : CSLAB_CHISQUARE_SELECTOR_V1
Purpose : A Program for Chisquare Selector in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 08/02/2019 09:59 hrs
Version : 1.0
*****************************/
## Program Description : A Program for Chisquare Selector in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.mllib.feature.ChiSqSelector
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.mllib.util.MLUtils
object ChiSqSelectorExample {
def main(args: Array[String]): Unit = {
val vAR_CSLAB_conf = new SparkConf().setAppName("ChiSqSelectorExample")
//val vAR_CSLAB_sc = new SparkContext(vAR_CSLAB_conf)
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "sample_libsvm_data.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load some data in libsvm format
val vAR_CSLAB_data = MLUtils.loadLibSVMFile(sc, vAR_CSLAB_FILE_PATH)
// Discretize data in 16 equal bins since ChiSqSelector requires categorical features
// Even though features are doubles, the ChiSqSelector treats each unique value as a category
val vAR_CSLAB_discretizedData = vAR_CSLAB_data.map { vAR_CSLAB_lp =>
LabeledPoint(vAR_CSLAB_lp.label, Vectors.dense(vAR_CSLAB_lp.features.toArray.map { vAR_CSLAB_x => (vAR_CSLAB_x / 16).floor }))
}
// Create ChiSqSelector that will select top 50 of 692 features
val vAR_CSLAB_selector = new ChiSqSelector(50)
// Create ChiSqSelector model (selecting features)
val vAR_CSLAB_transformer = vAR_CSLAB_selector.fit(vAR_CSLAB_discretizedData)
// Filter the top 50 features from each feature vector
val vAR_CSLAB_filteredData = vAR_CSLAB_discretizedData.map { vAR_CSLAB_lp =>
LabeledPoint(vAR_CSLAB_lp.label, vAR_CSLAB_transformer.transform(vAR_CSLAB_lp.features))
}
println("filtered data: ")
vAR_CSLAB_filteredData.foreach(vAR_CSLAB_x => println(vAR_CSLAB_x))
//sc.stop()
}
}
ChiSqSelectorExample.main(Array(" "))
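ChiSqSelector ranks each (discretized) feature by its chi-squared statistic against the label: the sum over contingency-table cells of (observed - expected)^2 / expected. A sketch for a single 2x2 feature/label table with illustrative counts:

```scala
// Pearson's chi-squared statistic for one contingency table.
def chiSquare(table: Array[Array[Double]]): Double = {
  val total = table.flatten.sum
  val rowSums = table.map(_.sum)
  val colSums = table.transpose.map(_.sum)
  (for {
    i <- table.indices
    j <- table(i).indices
  } yield {
    val expected = rowSums(i) * colSums(j) / total
    math.pow(table(i)(j) - expected, 2) / expected
  }).sum
}
// Feature value (rows) vs. label (columns); a higher statistic means the
// feature is more predictive of the label, so it ranks higher for selection.
println(chiSquare(Array(Array(20.0, 30.0), Array(30.0, 20.0))))  // 4.0
```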
/*****************************
File Name : CSLAB_CORRELATION_EXAMPLE_V1
Purpose : A Program for an Example of Correlation in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 08/02/2019 10:27 hrs
Version : 1.0
*****************************/
## Program Description : A Program for an Example of Correlation in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.linalg._
import org.apache.spark.mllib.stat.Statistics
import org.apache.spark.rdd.RDD
object CorrelationsExample {
def main(args: Array[String]): Unit = {
val vAR_CSLAB_conf = new SparkConf().setAppName("CorrelationsExample")
//val sc = new SparkContext(vAR_CSLAB_conf)
val vAR_CSLAB_seriesX: RDD[Double] = sc.parallelize(Array(1, 2, 3, 3, 5)) // a series
// must have the same number of partitions and cardinality as seriesX
val vAR_CSLAB_seriesY: RDD[Double] = sc.parallelize(Array(11, 22, 33, 33, 555))
// compute the correlation using Pearson's method. Enter "spearman" for Spearman's method. If a
// method is not specified, Pearson's method will be used by default.
val vAR_CSLAB_correlation: Double = Statistics.corr(vAR_CSLAB_seriesX, vAR_CSLAB_seriesY, "pearson")
println(s"Correlation is: $vAR_CSLAB_correlation")
val vAR_CSLAB_data: RDD[Vector] = sc.parallelize(
Seq(
Vectors.dense(1.0, 10.0, 100.0),
Vectors.dense(2.0, 20.0, 200.0),
Vectors.dense(5.0, 33.0, 366.0))
) // note that each Vector is a row and not a column
// calculate the correlation matrix using Pearson's method. Use "spearman" for Spearman's method
// If a method is not specified, Pearson's method will be used by default.
val vAR_CSLAB_correlMatrix: Matrix = Statistics.corr(vAR_CSLAB_data, "pearson")
println(vAR_CSLAB_correlMatrix.toString)
//sc.stop()
}
}
CorrelationsExample.main(Array(" "))
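Statistics.corr with "pearson" is just cov(X, Y) / (sigma_X * sigma_Y). Computing it directly on the same two series reproduces the result (about 0.85):

```scala
// Pearson correlation from its definition.
def pearson(x: Seq[Double], y: Seq[Double]): Double = {
  val mx = x.sum / x.size
  val my = y.sum / y.size
  val cov = x.zip(y).map { case (a, b) => (a - mx) * (b - my) }.sum
  val sx = math.sqrt(x.map(a => (a - mx) * (a - mx)).sum)
  val sy = math.sqrt(y.map(b => (b - my) * (b - my)).sum)
  cov / (sx * sy)
}
val r = pearson(Seq(1.0, 2.0, 3.0, 3.0, 5.0), Seq(11.0, 22.0, 33.0, 33.0, 555.0))
println(s"Correlation is: $r")
```

Spearman's variant applies the same formula to the ranks of the values rather than the values themselves.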
/*****************************
File Name : CSLAB_DECISION_TREE_CLASSIFICATION_EXAMPLE_V1
Purpose : A Program for Decision Tree Classification in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 08/02/2019 10:49 hrs
Version : 1.0
*****************************/
## Program Description : A Program for Decision Tree Classification in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.tree.DecisionTree
import org.apache.spark.mllib.tree.model.DecisionTreeModel
import org.apache.spark.mllib.util.MLUtils
object DecisionTreeClassificationExample {
def main(args: Array[String]): Unit = {
val vAR_CSLAB_conf = new SparkConf().setAppName("DecisionTreeClassificationExample")
//val sc = new SparkContext(vAR_CSLAB_conf)
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "sample_libsvm_data.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data file.
val vAR_CSLAB_data = MLUtils.loadLibSVMFile(sc, vAR_CSLAB_FILE_PATH)
// Split the data into training and test sets (30% held out for testing)
val vAR_CSLAB_splits = vAR_CSLAB_data.randomSplit(Array(0.7, 0.3))
val (vAR_CSLAB_trainingData, vAR_CSLAB_testData) = (vAR_CSLAB_splits(0), vAR_CSLAB_splits(1))
// Train a DecisionTree model.
// Empty categoricalFeaturesInfo indicates all features are continuous.
val vAR_CSLAB_numClasses = 2
val vAR_CSLAB_categoricalFeaturesInfo = Map[Int, Int]()
val vAR_CSLAB_impurity = "gini"
val vAR_CSLAB_maxDepth = 5
val vAR_CSLAB_maxBins = 32
val vAR_CSLAB_model = DecisionTree.trainClassifier(vAR_CSLAB_trainingData, vAR_CSLAB_numClasses, vAR_CSLAB_categoricalFeaturesInfo,
vAR_CSLAB_impurity, vAR_CSLAB_maxDepth, vAR_CSLAB_maxBins)
// Evaluate model on test instances and compute test error
val vAR_CSLAB_labelAndPreds = vAR_CSLAB_testData.map { vAR_CSLAB_point =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_point.label, vAR_CSLAB_prediction)
}
val vAR_CSLAB_testErr = vAR_CSLAB_labelAndPreds.filter(r => r._1 != r._2).count().toDouble / vAR_CSLAB_testData.count()
println(s"Test Error = $vAR_CSLAB_testErr")
println(s"Learned classification tree model:\n ${vAR_CSLAB_model.toDebugString}")
// Save and load model
vAR_CSLAB_model.save(sc, "target/tmp/myDecisionTreeClassificationModel")
val vAR_CSLAB_sameModel = DecisionTreeModel.load(sc, "target/tmp/myDecisionTreeClassificationModel")
//sc.stop()
}
}
DecisionTreeClassificationExample.main(Array(" "))
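The "gini" impurity that trainClassifier minimizes at each split is 1 minus the sum of squared class proportions; a pure node scores 0 and a maximally mixed binary node scores 0.5:

```scala
// Gini impurity of a node from its class counts.
def gini(counts: Seq[Int]): Double = {
  val total = counts.sum.toDouble
  1.0 - counts.map(c => (c / total) * (c / total)).sum
}
println(gini(Seq(5, 5)))   // maximally mixed binary node: 0.5
println(gini(Seq(10, 0)))  // pure node: 0.0
```

At each candidate split the tree picks the threshold that most reduces the weighted impurity of the two child nodes.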
/*****************************
File Name : CSLAB_DECISION_TREE_REGRESSION_EXAMPLE_V1
Purpose : A Program for Decision Tree Regression in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 08/02/2019 11:16 hrs
Version : 1.0
*****************************/
## Program Description : A Program for Decision Tree Regression in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.tree.DecisionTree
import org.apache.spark.mllib.tree.model.DecisionTreeModel
import org.apache.spark.mllib.util.MLUtils
object DecisionTreeRegressionExample {
def main(args: Array[String]): Unit = {
val vAR_CSLAB_conf = new SparkConf().setAppName("DecisionTreeRegressionExample")
// sc (SparkContext) is assumed to be provided by the notebook/shell environment;
// uncomment the next line when running as a standalone application
//val sc = new SparkContext(vAR_CSLAB_conf)
val vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
val vAR_CSLAB_DATA_FILE = "sample_libsvm_data.txt"
val vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data file.
val vAR_CSLAB_data = MLUtils.loadLibSVMFile(sc, vAR_CSLAB_FILE_PATH)
// Split the data into training and test sets (30% held out for testing)
val vAR_CSLAB_splits = vAR_CSLAB_data.randomSplit(Array(0.7, 0.3))
val (vAR_CSLAB_trainingData, vAR_CSLAB_testData) = (vAR_CSLAB_splits(0), vAR_CSLAB_splits(1))
// Train a DecisionTree model.
// Empty categoricalFeaturesInfo indicates all features are continuous.
val vAR_CSLAB_categoricalFeaturesInfo = Map[Int, Int]()
val vAR_CSLAB_impurity = "variance"
val vAR_CSLAB_maxDepth = 5
val vAR_CSLAB_maxBins = 32
val vAR_CSLAB_model = DecisionTree.trainRegressor(vAR_CSLAB_trainingData, vAR_CSLAB_categoricalFeaturesInfo, vAR_CSLAB_impurity,
vAR_CSLAB_maxDepth, vAR_CSLAB_maxBins)
// Evaluate model on test instances and compute test error
val vAR_CSLAB_labelsAndPredictions = vAR_CSLAB_testData.map { vAR_CSLAB_point =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_point.label, vAR_CSLAB_prediction)
}
val vAR_CSLAB_testMSE = vAR_CSLAB_labelsAndPredictions.map{ case (v, p) => math.pow(v - p, 2) }.mean()
println(s"Test Mean Squared Error = $vAR_CSLAB_testMSE")
println(s"Learned regression tree model:\n ${vAR_CSLAB_model.toDebugString}")
// Save and load model
vAR_CSLAB_model.save(sc, "target/tmp/myDecisionTreeRegressionModel")
val vAR_CSLAB_sameModel = DecisionTreeModel.load(sc, "target/tmp/myDecisionTreeRegressionModel")
//sc.stop()
}
}
DecisionTreeRegressionExample.main(Array(" "))
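The mean-squared-error step above, `map { case (v, p) => math.pow(v - p, 2) }.mean()`, can be reproduced on plain collections with hypothetical (label, prediction) pairs:

```scala
// Hypothetical (label, prediction) pairs standing in for vAR_CSLAB_labelsAndPredictions
val labelsAndPredictions = Seq((3.0, 2.5), (1.0, 1.0), (4.0, 5.0))
// MSE = average of squared residuals: (0.25 + 0.0 + 1.0) / 3
val testMSE = labelsAndPredictions.map { case (v, p) => math.pow(v - p, 2) }.sum / labelsAndPredictions.size
```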
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research purposes. This code is not production-ready. We assume no liability for this code under any circumstances; by using it, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendors who developed or own them.
*****************************/
/*****************************
File Name : CSLAB_HYPOTHESIS_TESTING_V1
Purpose : A Program for Hypothesis Testing in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 08/02/2019 11:41 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Hypothesis Testing in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.linalg._
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.mllib.stat.Statistics
import org.apache.spark.mllib.stat.test.ChiSqTestResult
import org.apache.spark.rdd.RDD
object HypothesisTestingExample {
def main(args: Array[String]) {
val vAR_CSLAB_conf = new SparkConf().setAppName("HypothesisTestingExample")
// sc (SparkContext) is assumed to be provided by the notebook/shell environment
// a vector composed of the frequencies of events
val vAR_CSLAB_vec: Vector = Vectors.dense(0.1, 0.15, 0.2, 0.3, 0.25)
// compute the goodness of fit. If a second vector to test against is not supplied
// as a parameter, the test runs against a uniform distribution.
val vAR_CSLAB_goodnessOfFitTestResult = Statistics.chiSqTest(vAR_CSLAB_vec)
// summary of the test including the p-value, degrees of freedom, test statistic, the method
// used, and the null hypothesis.
println(s"$vAR_CSLAB_goodnessOfFitTestResult\n")
// a contingency matrix. Create a dense matrix ((1.0, 2.0), (3.0, 4.0), (5.0, 6.0))
val vAR_CSLAB_mat: Matrix = Matrices.dense(3, 2, Array(1.0, 3.0, 5.0, 2.0, 4.0, 6.0))
// conduct Pearson's independence test on the input contingency matrix
val vAR_CSLAB_independenceTestResult = Statistics.chiSqTest(vAR_CSLAB_mat)
// summary of the test including the p-value, degrees of freedom
println(s"$vAR_CSLAB_independenceTestResult\n")
val vAR_CSLAB_obs: RDD[LabeledPoint] =
  sc.parallelize(
    Seq(
      LabeledPoint(1.0, Vectors.dense(1.0, 0.0, 3.0)),
      LabeledPoint(1.0, Vectors.dense(1.0, 2.0, 0.0)),
      LabeledPoint(-1.0, Vectors.dense(-1.0, 0.0, -0.5))
    )
  ) // (label, feature) pairs.
// The contingency table is constructed from the raw (label, feature) pairs and used to conduct
// the independence test. Returns an array containing the ChiSquaredTestResult for every feature
// against the label.
val vAR_CSLAB_featureTestResults: Array[ChiSqTestResult] = Statistics.chiSqTest(vAR_CSLAB_obs)
vAR_CSLAB_featureTestResults.zipWithIndex.foreach { case (k, v) =>
println(s"Column ${(v + 1)} :")
println(k)
} // summary of the test
}
}
HypothesisTestingExample.main(Array(" "))
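The goodness-of-fit statistic that `Statistics.chiSqTest` reports against a uniform distribution can be reproduced by hand; a sketch using the same frequency vector (this computes only the test statistic, not the p-value):

```scala
// Same frequencies as vAR_CSLAB_vec; under uniformity, each expected value is sum/length
val observed = Seq(0.1, 0.15, 0.2, 0.3, 0.25)
val expected = observed.sum / observed.size // 0.2
// Pearson chi-squared statistic: sum of (observed - expected)^2 / expected
val chiSq = observed.map(o => math.pow(o - expected, 2) / expected).sum
// 0.025 / 0.2 = 0.125, with length - 1 = 4 degrees of freedom
```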
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research purposes. This code is not production-ready. We assume no liability for this code under any circumstances; by using it, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendors who developed or own them.
*****************************/
/*****************************
File Name : CSLAB_ISOTONIC_REGRESSION_V1
Purpose : A Program for Isotonic Regression in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 08/02/2019 12:16 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Isotonic Regression in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.regression.{IsotonicRegression, IsotonicRegressionModel}
import org.apache.spark.mllib.util.MLUtils
object IsotonicRegressionExample {
def main(args: Array[String]): Unit = {
val vAR_CSLAB_conf = new SparkConf().setAppName("IsotonicRegressionExample")
// sc (SparkContext) is assumed to be provided by the notebook/shell environment
val vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
val vAR_CSLAB_DATA_FILE = "sample_isotonic_regression_libsvm_data.txt"
val vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
val vAR_CSLAB_data = MLUtils.loadLibSVMFile(sc,vAR_CSLAB_FILE_PATH).cache()
// Create label, feature, weight tuples from input data with weight set to default value 1.0.
val vAR_CSLAB_parsedData = vAR_CSLAB_data.map { vAR_CSLAB_labeledPoint =>
(vAR_CSLAB_labeledPoint.label, vAR_CSLAB_labeledPoint.features(0), 1.0)
}
// Split data into training (60%) and test (40%) sets.
val vAR_CSLAB_splits = vAR_CSLAB_parsedData.randomSplit(Array(0.6, 0.4), seed = 11L)
val vAR_CSLAB_training = vAR_CSLAB_splits(0)
val vAR_CSLAB_test = vAR_CSLAB_splits(1)
// Create isotonic regression model from training data.
// Isotonic parameter defaults to true so it is only shown for demonstration
val vAR_CSLAB_model = new IsotonicRegression().setIsotonic(true).run(vAR_CSLAB_training)
// Create tuples of predicted and real labels.
val vAR_CSLAB_predictionAndLabel = vAR_CSLAB_test.map { vAR_CSLAB_point =>
val vAR_CSLAB_predictedLabel = vAR_CSLAB_model.predict(vAR_CSLAB_point._2)
(vAR_CSLAB_predictedLabel, vAR_CSLAB_point._1)
}
// Calculate mean squared error between predicted and real labels.
val vAR_CSLAB_meanSquaredError = vAR_CSLAB_predictionAndLabel.map { case (p, l) => math.pow((p - l), 2) }.mean()
println(s"Mean Squared Error = $vAR_CSLAB_meanSquaredError")
// Save and load model
vAR_CSLAB_model.save(sc, "myIsotonicRegressionModel")
val vAR_CSLAB_sameModel = IsotonicRegressionModel.load(sc, "myIsotonicRegressionModel")
}
}
IsotonicRegressionExample.main(Array(" "))
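Isotonic regression fits a monotone (non-decreasing) sequence to the labels. A minimal unweighted pool-adjacent-violators sketch illustrates the idea; this is an illustration of the technique, not Spark's actual implementation:

```scala
// Pool adjacent violators: merge neighbouring blocks while their means
// violate the non-decreasing order, then emit each block's mean
def pava(ys: Seq[Double]): Seq[Double] = {
  val blocks = scala.collection.mutable.ListBuffer.empty[(Double, Int)] // (sum, count)
  for (y <- ys) {
    blocks += ((y, 1))
    while (blocks.size > 1 && {
      val (s1, n1) = blocks(blocks.size - 2)
      val (s2, n2) = blocks.last
      s1 / n1 > s2 / n2 // order violated: previous block mean exceeds current
    }) {
      val (s2, n2) = blocks.remove(blocks.size - 1)
      val (s1, n1) = blocks.remove(blocks.size - 1)
      blocks += ((s1 + s2, n1 + n2)) // pool the two blocks
    }
  }
  blocks.flatMap { case (s, n) => Seq.fill(n)(s / n) }.toSeq
}

val fitted = pava(Seq(1.0, 3.0, 2.0, 4.0))
// The violating pair (3.0, 2.0) is pooled to its mean 2.5
```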
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research purposes. This code is not production-ready. We assume no liability for this code under any circumstances; by using it, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendors who developed or own them.
*****************************/
/*****************************
File Name : CSLAB_KERNEL_DENSITY_ESTIMATION_V1
Purpose : A Program for Kernel Density Estimation in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 08/02/2019 12:54 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Kernel Density Estimation in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.stat.KernelDensity
import org.apache.spark.rdd.RDD
object KernelDensityEstimationExample {
def main(args: Array[String]): Unit = {
val vAR_CSLAB_conf = new SparkConf().setAppName("KernelDensityEstimationExample")
// sc (SparkContext) is assumed to be provided by the notebook/shell environment
// an RDD of sample data
val vAR_CSLAB_data: RDD[Double] = sc.parallelize(Seq(1, 1, 1, 2, 3, 4, 5, 5, 6, 7, 8, 9, 9))
// Construct the density estimator with the sample data and a standard deviation
// for the Gaussian kernels
val vAR_CSLAB_kd = new KernelDensity()
.setSample(vAR_CSLAB_data)
.setBandwidth(3.0)
// Find density estimates for the given values
val vAR_CSLAB_densities = vAR_CSLAB_kd.estimate(Array(-1.0, 2.0, 5.0))
vAR_CSLAB_densities.foreach(println)
}
}
KernelDensityEstimationExample.main(Array(" "))
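A Gaussian kernel density estimate is just an average of Gaussian bumps centred at the sample points. A hand-rolled sketch with the same sample and bandwidth as above:

```scala
val sample = Seq(1.0, 1.0, 1.0, 2.0, 3.0, 4.0, 5.0, 5.0, 6.0, 7.0, 8.0, 9.0, 9.0)
val bandwidth = 3.0
// Gaussian density with standard deviation = bandwidth, centred at mean
def gaussian(x: Double, mean: Double, sd: Double): Double =
  math.exp(-math.pow(x - mean, 2) / (2 * sd * sd)) / (sd * math.sqrt(2 * math.Pi))
// KDE at x: average kernel contribution over the sample
def estimate(x: Double): Double = sample.map(s => gaussian(x, s, bandwidth)).sum / sample.size
```

The estimate is highest near the bulk of the sample, so `estimate(5.0)` exceeds `estimate(-1.0)`.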
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research purposes. This code is not production-ready. We assume no liability for this code under any circumstances; by using it, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendors who developed or own them.
*****************************/
/*****************************
File Name : CSLAB_NORMALIZER_V1
Purpose : A Program for Normalizers in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 08/02/2019 13:47 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Normalizers in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.mllib.feature.Normalizer
import org.apache.spark.mllib.util.MLUtils
object NormalizerExample {
def main(args: Array[String]): Unit = {
val vAR_CSLAB_conf = new SparkConf().setAppName("NormalizerExample")
// sc (SparkContext) is assumed to be provided by the notebook/shell environment
val vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
val vAR_CSLAB_DATA_FILE = "sample_libsvm_data.txt"
val vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
val vAR_CSLAB_data = MLUtils.loadLibSVMFile(sc,vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_normalizer1 = new Normalizer()
val vAR_CSLAB_normalizer2 = new Normalizer(p = Double.PositiveInfinity)
// Each sample in data1 will be normalized using $L^2$ norm.
val vAR_CSLAB_data1 = vAR_CSLAB_data.map(vAR_CSLAB_x => (vAR_CSLAB_x.label, vAR_CSLAB_normalizer1.transform(vAR_CSLAB_x.features)))
// Each sample in data2 will be normalized using $L^\infty$ norm.
val vAR_CSLAB_data2 = vAR_CSLAB_data.map(vAR_CSLAB_x => (vAR_CSLAB_x.label, vAR_CSLAB_normalizer2.transform(vAR_CSLAB_x.features)))
println("vAR_CSLAB_data1: ")
vAR_CSLAB_data1.foreach(vAR_CSLAB_x => println(vAR_CSLAB_x))
println("vAR_CSLAB_data2: ")
vAR_CSLAB_data2.foreach(vAR_CSLAB_x => println(vAR_CSLAB_x))
}
}
NormalizerExample.main(Array(" "))
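What the two normalizers do to a single vector can be shown on a plain array: the L2 normalizer divides by the Euclidean norm, while the L-infinity normalizer divides by the largest absolute value:

```scala
val v = Array(1.0, 2.0, 2.0)
val l2 = math.sqrt(v.map(x => x * x).sum) // Euclidean norm = sqrt(1 + 4 + 4) = 3.0
val l2Normalized = v.map(_ / l2)          // unit L2 norm
val lInf = v.map(math.abs).max            // max-abs norm = 2.0
val lInfNormalized = v.map(_ / lInf)      // (0.5, 1.0, 1.0)
```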
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research purposes. This code is not production-ready. We assume no liability for this code under any circumstances; by using it, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendors who developed or own them.
*****************************/
/*****************************
File Name : CSLAB_PRINCIPAL_COMPONENT_ANALYSIS_V1
Purpose : A Program for Principal Component Analysis in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 08/02/2019 14:22 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Principal Component Analysis in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.mllib.feature.PCA
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.{LabeledPoint, LinearRegressionWithSGD}
@deprecated("Deprecated since LinearRegressionWithSGD is deprecated. Use ml.feature.PCA", "2.0.0")
object PCAExample {
def main(args: Array[String]): Unit = {
val vAR_CSLAB_conf = new SparkConf().setAppName("PCAExample")
// sc (SparkContext) is assumed to be provided by the notebook/shell environment
val vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
val vAR_CSLAB_DATA_FILE = "ridge-data/lpsa.data"
val vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
val vAR_CSLAB_data = sc.textFile(vAR_CSLAB_FILE_PATH).map { line =>
val vAR_CSLAB_parts = line.split(',')
LabeledPoint(vAR_CSLAB_parts(0).toDouble, Vectors.dense(vAR_CSLAB_parts(1).split(' ').map(_.toDouble)))
}.cache()
val vAR_CSLAB_splits = vAR_CSLAB_data.randomSplit(Array(0.6, 0.4), seed = 11L)
val vAR_CSLAB_training = vAR_CSLAB_splits(0).cache()
val vAR_CSLAB_test = vAR_CSLAB_splits(1)
val vAR_CSLAB_pca = new PCA(vAR_CSLAB_training.first().features.size / 2).fit(vAR_CSLAB_data.map(_.features))
val vAR_CSLAB_training_pca = vAR_CSLAB_training.map(vAR_CSLAB_p => vAR_CSLAB_p.copy(features = vAR_CSLAB_pca.transform(vAR_CSLAB_p.features)))
val vAR_CSLAB_test_pca = vAR_CSLAB_test.map(vAR_CSLAB_p => vAR_CSLAB_p.copy(features = vAR_CSLAB_pca.transform(vAR_CSLAB_p.features)))
val vAR_CSLAB_numIterations = 100
val vAR_CSLAB_model = LinearRegressionWithSGD.train(vAR_CSLAB_training, vAR_CSLAB_numIterations)
val vAR_CSLAB_model_pca = LinearRegressionWithSGD.train(vAR_CSLAB_training_pca, vAR_CSLAB_numIterations)
val vAR_CSLAB_valuesAndPreds = vAR_CSLAB_test.map { vAR_CSLAB_point =>
val vAR_CSLAB_score = vAR_CSLAB_model.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_score, vAR_CSLAB_point.label)
}
val vAR_CSLAB_valuesAndPreds_pca = vAR_CSLAB_test_pca.map { vAR_CSLAB_point =>
val vAR_CSLAB_score = vAR_CSLAB_model_pca.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_score, vAR_CSLAB_point.label)
}
val vAR_CSLAB_MSE = vAR_CSLAB_valuesAndPreds.map { case (v, vAR_CSLAB_p) => math.pow((v - vAR_CSLAB_p), 2) }.mean()
val vAR_CSLAB_MSE_pca = vAR_CSLAB_valuesAndPreds_pca.map { case (v, vAR_CSLAB_p) => math.pow((v - vAR_CSLAB_p), 2) }.mean()
println(s"Mean Squared Error = $vAR_CSLAB_MSE")
println(s"PCA Mean Squared Error = $vAR_CSLAB_MSE_pca")
}
}
PCAExample.main(Array(" "))
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research purposes. This code is not production-ready. We assume no liability for this code under any circumstances; by using it, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendors who developed or own them.
*****************************/
/*****************************
File Name : CSLAB_PRINCIPAL_COMPONENT_ANALYSIS_ON_ROW_MATRIX_V1
Purpose : A Program for Principal Component Analysis on Row Matrix in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 08/02/2019 14:59 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Principal Component Analysis on Row Matrix in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.mllib.linalg.Matrix
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.linalg.distributed.RowMatrix
object PCAOnRowMatrixExample {
def main(args: Array[String]): Unit = {
val vAR_CSLAB_conf = new SparkConf().setAppName("PCAOnRowMatrixExample")
// sc (SparkContext) is assumed to be provided by the notebook/shell environment
val vAR_CSLAB_data = Array(
Vectors.sparse(5, Seq((1, 1.0), (3, 7.0))),
Vectors.dense(2.0, 0.0, 3.0, 4.0, 5.0),
Vectors.dense(4.0, 0.0, 0.0, 6.0, 7.0))
val vAR_CSLAB_rows = sc.parallelize(vAR_CSLAB_data)
val vAR_CSLAB_mat: RowMatrix = new RowMatrix(vAR_CSLAB_rows)
// Compute the top 4 principal components.
// Principal components are stored in a local dense matrix.
val vAR_CSLAB_pc: Matrix = vAR_CSLAB_mat.computePrincipalComponents(4)
// Project the rows to the linear space spanned by the top 4 principal components.
val vAR_CSLAB_projected: RowMatrix = vAR_CSLAB_mat.multiply(vAR_CSLAB_pc)
val vAR_CSLAB_collect = vAR_CSLAB_projected.rows.collect()
println("Projected Row Matrix of principal component:")
vAR_CSLAB_collect.foreach { vAR_CSLAB_vector => println(vAR_CSLAB_vector) }
}
}
PCAOnRowMatrixExample.main(Array(" "))
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research purposes. This code is not production-ready. We assume no liability for this code under any circumstances; by using it, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendors who developed or own them.
*****************************/
/*****************************
File Name : CSLAB_PRINCIPAL_COMPONENT_ANALYSIS_ON_SOURCE_VECTOR_V1
Purpose : A Program for Principal Component Analysis on Source Vector in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 08/02/2019 15:28 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Principal Component Analysis on Source Vector in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.mllib.feature.PCA
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.rdd.RDD
object PCAOnSourceVectorExample {
def main(args: Array[String]): Unit = {
val vAR_CSLAB_conf = new SparkConf().setAppName("PCAOnSourceVectorExample")
// sc (SparkContext) is assumed to be provided by the notebook/shell environment
val vAR_CSLAB_data: RDD[LabeledPoint] = sc.parallelize(Seq(
new LabeledPoint(0, Vectors.dense(1, 0, 0, 0, 1)),
new LabeledPoint(1, Vectors.dense(1, 1, 0, 1, 0)),
new LabeledPoint(1, Vectors.dense(1, 1, 0, 0, 0)),
new LabeledPoint(0, Vectors.dense(1, 0, 0, 0, 0)),
new LabeledPoint(1, Vectors.dense(1, 1, 0, 0, 0))))
// Compute the top 5 principal components.
val vAR_CSLAB_pca = new PCA(5).fit(vAR_CSLAB_data.map(_.features))
// Project vectors to the linear space spanned by the top 5 principal
// components, keeping the label
val vAR_CSLAB_projected = vAR_CSLAB_data.map(vAR_CSLAB_p => vAR_CSLAB_p.copy(features = vAR_CSLAB_pca.transform(vAR_CSLAB_p.features)))
val vAR_CSLAB_collect = vAR_CSLAB_projected.collect()
println("Projected vector of principal component:")
vAR_CSLAB_collect.foreach { vAR_CSLAB_vector => println(vAR_CSLAB_vector) }
}
}
PCAOnSourceVectorExample.main(Array(" "))
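The projection step in `vAR_CSLAB_pca.transform` is a matrix-vector product: each projected coordinate is the dot product of one principal component with the input vector. A sketch with made-up 2-component loadings for a 3-dimensional input (the loadings are hypothetical, not the ones PCA would learn from the data above):

```scala
// Rows are (hypothetical) principal components; columns match the input dimensions
val components = Array(Array(0.6, 0.8, 0.0), Array(0.0, 0.0, 1.0))
val x = Array(1.0, 2.0, 3.0)
// Each projected coordinate = dot product of a component row with x
val projected = components.map(c => c.zip(x).map { case (a, b) => a * b }.sum)
// 0.6*1 + 0.8*2 = 2.2 and 1.0*3 = 3.0
```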
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research purposes. This code is not production-ready. We assume no liability for this code under any circumstances; by using it, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendors who developed or own them.
*****************************/
/*****************************
File Name : CSLAB_RANDOM_FOREST_CLASSIFICATION_V1
Purpose : A Program for Random Forest Classification in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 08/02/2019 16:04 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Random Forest Classification in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.tree.RandomForest
import org.apache.spark.mllib.tree.model.RandomForestModel
import org.apache.spark.mllib.util.MLUtils
object RandomForestClassificationExample {
def main(args: Array[String]): Unit = {
val vAR_CSLAB_conf = new SparkConf().setAppName("RandomForestClassificationExample")
// sc (SparkContext) is assumed to be provided by the notebook/shell environment
val vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
val vAR_CSLAB_DATA_FILE = "sample_libsvm_data.txt"
val vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data file.
val vAR_CSLAB_data = MLUtils.loadLibSVMFile(sc, vAR_CSLAB_FILE_PATH)
// Split the data into training and test sets (30% held out for testing)
val vAR_CSLAB_splits = vAR_CSLAB_data.randomSplit(Array(0.7, 0.3))
val (vAR_CSLAB_trainingData, vAR_CSLAB_testData) = (vAR_CSLAB_splits(0), vAR_CSLAB_splits(1))
// Train a RandomForest model.
// Empty categoricalFeaturesInfo indicates all features are continuous.
val vAR_CSLAB_numClasses = 2
val vAR_CSLAB_categoricalFeaturesInfo = Map[Int, Int]()
val vAR_CSLAB_numTrees = 3 // Use more in practice.
val vAR_CSLAB_featureSubsetStrategy = "auto" // Let the algorithm choose.
val vAR_CSLAB_impurity = "gini"
val vAR_CSLAB_maxDepth = 4
val vAR_CSLAB_maxBins = 32
val vAR_CSLAB_model = RandomForest.trainClassifier(vAR_CSLAB_trainingData, vAR_CSLAB_numClasses, vAR_CSLAB_categoricalFeaturesInfo,
vAR_CSLAB_numTrees, vAR_CSLAB_featureSubsetStrategy, vAR_CSLAB_impurity, vAR_CSLAB_maxDepth, vAR_CSLAB_maxBins)
// Evaluate model on test instances and compute test error
val vAR_CSLAB_labelAndPreds = vAR_CSLAB_testData.map { vAR_CSLAB_point =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_point.label, vAR_CSLAB_prediction)
}
val vAR_CSLAB_testErr = vAR_CSLAB_labelAndPreds.filter(vAR_CSLAB_r => vAR_CSLAB_r._1 != vAR_CSLAB_r._2).count.toDouble / vAR_CSLAB_testData.count()
println(s"Test Error = $vAR_CSLAB_testErr")
println(s"Learned classification forest model:\n ${vAR_CSLAB_model.toDebugString}")
// Save and load model
vAR_CSLAB_model.save(sc, "myRandomForestClassificationModel")
val vAR_CSLAB_sameModel = RandomForestModel.load(sc, "myRandomForestClassificationModel")
}
}
RandomForestClassificationExample.main(Array(" "))
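For classification, the forest's `predict` is a majority vote over the individual trees. Sketched with hypothetical per-tree votes for a single test point:

```scala
// Hypothetical class votes from the three trees for one test point
val treeVotes = Seq(1.0, 0.0, 1.0)
// Majority vote: the class with the most votes wins
val prediction = treeVotes.groupBy(identity).maxBy(_._2.size)._1
// Two of three trees vote for class 1.0
```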
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research purposes. This code is not production-ready. We assume no liability for this code under any circumstances; by using it, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendors who developed or own them.
*****************************/
/*****************************
File Name : CSLAB_RANDOM_FOREST_REGRESSION_V1
Purpose : A Program for Random Forest Regression in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 08/02/2019 16:41 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Random Forest Regression in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.tree.RandomForest
import org.apache.spark.mllib.tree.model.RandomForestModel
import org.apache.spark.mllib.util.MLUtils
object RandomForestRegressionExample {
def main(args: Array[String]): Unit = {
val vAR_CSLAB_conf = new SparkConf().setAppName("RandomForestRegressionExample")
// sc (SparkContext) is assumed to be provided by the notebook/shell environment
val vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
val vAR_CSLAB_DATA_FILE = "sample_libsvm_data.txt"
val vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data file.
val vAR_CSLAB_data = MLUtils.loadLibSVMFile(sc, vAR_CSLAB_FILE_PATH)
// Split the data into training and test sets (30% held out for testing)
val vAR_CSLAB_splits = vAR_CSLAB_data.randomSplit(Array(0.7, 0.3))
val (vAR_CSLAB_trainingData, vAR_CSLAB_testData) = (vAR_CSLAB_splits(0), vAR_CSLAB_splits(1))
// Train a RandomForest model.
// Empty categoricalFeaturesInfo indicates all features are continuous.
val vAR_CSLAB_numClasses = 2
val vAR_CSLAB_categoricalFeaturesInfo = Map[Int, Int]()
val vAR_CSLAB_numTrees = 3 // Use more in practice.
val vAR_CSLAB_featureSubsetStrategy = "auto" // Let the algorithm choose.
val vAR_CSLAB_impurity = "variance"
val vAR_CSLAB_maxDepth = 4
val vAR_CSLAB_maxBins = 32
val vAR_CSLAB_model = RandomForest.trainRegressor(vAR_CSLAB_trainingData, vAR_CSLAB_categoricalFeaturesInfo,
vAR_CSLAB_numTrees, vAR_CSLAB_featureSubsetStrategy, vAR_CSLAB_impurity, vAR_CSLAB_maxDepth, vAR_CSLAB_maxBins)
// Evaluate model on test instances and compute test error
val vAR_CSLAB_labelsAndPredictions = vAR_CSLAB_testData.map { vAR_CSLAB_point =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_point.label, vAR_CSLAB_prediction)
}
val vAR_CSLAB_testMSE = vAR_CSLAB_labelsAndPredictions.map{ case(v, p) => math.pow((v - p), 2)}.mean()
println(s"Test Mean Squared Error = $vAR_CSLAB_testMSE")
println(s"Learned regression forest model:\n ${vAR_CSLAB_model.toDebugString}")
// Save and load model
vAR_CSLAB_model.save(sc, "myRandomForestRegressionModel")
val vAR_CSLAB_sameModel = RandomForestModel.load(sc, "myRandomForestRegressionModel")
}
}
RandomForestRegressionExample.main(Array(" "))
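For regression, the forest averages the individual tree predictions instead of voting. With hypothetical per-tree outputs for one test point:

```scala
// Hypothetical predictions from the three trees for a single test point
val treePredictions = Seq(2.0, 3.0, 4.0)
// The ensemble prediction is the mean of the tree outputs
val prediction = treePredictions.sum / treePredictions.size
```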
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research purposes. This code is not production-ready. We assume no liability for this code under any circumstances; by using it, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendors who developed or own them.
*****************************/
/*****************************
File Name : CSLAB_RANDOM_RDD_GENERATION_V1
Purpose : A Program for Random RDD Generation in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 08/02/2019 17:27 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Random RDD Generation in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.random.RandomRDDs
import org.apache.spark.rdd.RDD
object RandomRDDGeneration {
def main(args: Array[String]) {
val vAR_CSLAB_conf = new SparkConf().setAppName("RandomRDDGeneration")
// sc (SparkContext) is assumed to be provided by the notebook/shell environment
val vAR_CSLAB_numExamples = 10000 // number of examples to generate
val vAR_CSLAB_fraction = 0.1 // fraction of data to sample
// Example: RandomRDDs.normalRDD
val vAR_CSLAB_normalRDD: RDD[Double] = RandomRDDs.normalRDD(sc, vAR_CSLAB_numExamples)
println(s"Generated RDD of ${vAR_CSLAB_normalRDD.count()}" +
" examples sampled from the standard normal distribution")
println(" First 5 samples:")
vAR_CSLAB_normalRDD.take(5).foreach( vAR_CSLAB_x => println(s" $vAR_CSLAB_x") )
// Example: RandomRDDs.normalVectorRDD
val vAR_CSLAB_normalVectorRDD = RandomRDDs.normalVectorRDD(sc, numRows = vAR_CSLAB_numExamples, numCols = 2)
println(s"Generated RDD of ${vAR_CSLAB_normalVectorRDD.count()} examples of length-2 vectors.")
println(" First 5 samples:")
vAR_CSLAB_normalVectorRDD.take(5).foreach( vAR_CSLAB_x => println(s" $vAR_CSLAB_x") )
println()
}
}
RandomRDDGeneration.main(Array(" "))
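The same kind of standard-normal sample can be drawn without Spark using `scala.util.Random.nextGaussian`; a quick sanity check that the sample mean sits near 0:

```scala
val rng = new scala.util.Random(42L) // fixed seed for reproducibility
val samples = Seq.fill(10000)(rng.nextGaussian())
val mean = samples.sum / samples.size
// For a standard normal, the sample mean of 10,000 draws should be close to 0
```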
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research purposes. This code is not production-ready. We assume no liability for this code under any circumstances; by using it, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendors who developed or own them.
*****************************/
/*****************************
File Name : CSLAB_RANKING_METRICS_V1
Purpose : A Program for Ranking Metrics in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 08/02/2019 18:04 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Ranking Metrics in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.evaluation.{RankingMetrics, RegressionMetrics}
import org.apache.spark.mllib.recommendation.{ALS, Rating}
import org.apache.spark.sql.SparkSession
object RankingMetricsExample {
def main(args: Array[String]) {
val vAR_CSLAB_spark = SparkSession
.builder
.appName("RankingMetricsExample")
.getOrCreate()
import vAR_CSLAB_spark.implicits._
val vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
val vAR_CSLAB_DATA_FILE = "sample_movielens_data.txt"
val vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Read in the ratings data
val vAR_CSLAB_ratings = vAR_CSLAB_spark.read.textFile(vAR_CSLAB_FILE_PATH).rdd.map { vAR_CSLAB_line =>
val vAR_CSLAB_fields = vAR_CSLAB_line.split("::")
Rating(vAR_CSLAB_fields(0).toInt, vAR_CSLAB_fields(1).toInt, vAR_CSLAB_fields(2).toDouble - 2.5)
}.cache()
// Map ratings to 1 or 0, 1 indicating a movie that should be recommended
val vAR_CSLAB_binarizedRatings = vAR_CSLAB_ratings.map(vAR_CSLAB_r => Rating(vAR_CSLAB_r.user, vAR_CSLAB_r.product,
if (vAR_CSLAB_r.rating > 0) 1.0 else 0.0)).cache()
// Summarize ratings
val vAR_CSLAB_numRatings = vAR_CSLAB_ratings.count()
val vAR_CSLAB_numUsers = vAR_CSLAB_ratings.map(_.user).distinct().count()
val vAR_CSLAB_numMovies = vAR_CSLAB_ratings.map(_.product).distinct().count()
println(s"Got $vAR_CSLAB_numRatings ratings from $vAR_CSLAB_numUsers users on $vAR_CSLAB_numMovies movies.")
// Build the model
val vAR_CSLAB_numIterations = 10
val vAR_CSLAB_rank = 10
val vAR_CSLAB_lambda = 0.01
val vAR_CSLAB_model = ALS.train(vAR_CSLAB_ratings, vAR_CSLAB_rank, vAR_CSLAB_numIterations, vAR_CSLAB_lambda)
// Define a function to scale ratings from 0 to 1
def scaledRating(vAR_CSLAB_r: Rating): Rating = {
val vAR_CSLAB_scaledRating = math.max(math.min(vAR_CSLAB_r.rating, 1.0), 0.0)
Rating(vAR_CSLAB_r.user, vAR_CSLAB_r.product, vAR_CSLAB_scaledRating)
}
// Get sorted top ten predictions for each user and then scale from [0, 1]
val vAR_CSLAB_userRecommended = vAR_CSLAB_model.recommendProductsForUsers(10).map { case (user, recs) =>
(user, recs.map(scaledRating))
}
// Assume that any movie a user rated 3 or higher (which maps to a 1) is a relevant document
// Compare with top ten most relevant documents
val vAR_CSLAB_userMovies = vAR_CSLAB_binarizedRatings.groupBy(_.user)
val vAR_CSLAB_relevantDocuments = vAR_CSLAB_userMovies.join(vAR_CSLAB_userRecommended).map { case (user, (actual,
predictions)) =>
(predictions.map(_.product), actual.filter(_.rating > 0.0).map(_.product).toArray)
}
// Instantiate metrics object
val vAR_CSLAB_metrics = new RankingMetrics(vAR_CSLAB_relevantDocuments)
// Precision at K
Array(1, 3, 5).foreach { k =>
println(s"Precision at $k = ${vAR_CSLAB_metrics.precisionAt(k)}")
}
// Mean average precision
println(s"Mean average precision = ${vAR_CSLAB_metrics.meanAveragePrecision}")
// Normalized discounted cumulative gain
Array(1, 3, 5).foreach { k =>
println(s"NDCG at $k = ${vAR_CSLAB_metrics.ndcgAt(k)}")
}
// Get predictions for each data point
val vAR_CSLAB_allPredictions = vAR_CSLAB_model.predict(vAR_CSLAB_ratings.map(vAR_CSLAB_r => (vAR_CSLAB_r.user, vAR_CSLAB_r.product))).map(vAR_CSLAB_r => ((vAR_CSLAB_r.user,
vAR_CSLAB_r.product), vAR_CSLAB_r.rating))
val vAR_CSLAB_allRatings = vAR_CSLAB_ratings.map(vAR_CSLAB_r => ((vAR_CSLAB_r.user, vAR_CSLAB_r.product), vAR_CSLAB_r.rating)
)
val vAR_CSLAB_predictionsAndLabels = vAR_CSLAB_allPredictions.join(vAR_CSLAB_allRatings).map { case ((user, product),
(predicted, actual)) =>
(predicted, actual)
}
// Get the RMSE using regression metrics
val vAR_CSLAB_regressionMetrics = new RegressionMetrics(vAR_CSLAB_predictionsAndLabels)
println(s"RMSE = ${vAR_CSLAB_regressionMetrics.rootMeanSquaredError}")
// R-squared
println(s"R-squared = ${vAR_CSLAB_regressionMetrics.r2}")
}
}
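`RankingMetrics.precisionAt(k)` for a single user can be reproduced by hand: it is the fraction of the top-k recommended items that are relevant. A sketch with hypothetical item IDs:

```scala
val recommended = Seq(10, 20, 30, 40, 50) // predicted ranking, best first
val relevant = Set(20, 40, 60)            // ground-truth relevant items
// Precision at k = |top-k recommendations that are relevant| / k
def precisionAt(k: Int): Double =
  recommended.take(k).count(relevant.contains).toDouble / k
// precisionAt(1) = 0.0, precisionAt(3) = 1.0/3, precisionAt(5) = 0.4
```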
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
/*****************************
/*****************************
File Name : CSLAB_RECOMMENDATION_EXAMPLE_V1
Purpose : A Program for a Recommendation Example in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 08/02/2019 18:42 hrs
Version : 1.0
*****************************/
## Program Description : A Program for a Recommendation Example in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.recommendation.ALS
import org.apache.spark.mllib.recommendation.MatrixFactorizationModel
import org.apache.spark.mllib.recommendation.Rating
object RecommendationExample {
def main(args: Array[String]): Unit = {
val vAR_CSLAB_conf = new SparkConf().setAppName("CollaborativeFilteringExample")
// Create the SparkContext (already predefined as sc in spark-shell and Spark Jupyter kernels)
val sc = new SparkContext(vAR_CSLAB_conf)
val vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
val vAR_CSLAB_DATA_FILE = "als/test.data"
val vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data
val vAR_CSLAB_data = sc.textFile(vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_ratings = vAR_CSLAB_data.map(_.split(',') match { case Array(user, item, rate) =>
Rating(user.toInt, item.toInt, rate.toDouble)
})
// Build the recommendation model using ALS
val vAR_CSLAB_rank = 10
val vAR_CSLAB_numIterations = 10
val vAR_CSLAB_model = ALS.train(vAR_CSLAB_ratings, vAR_CSLAB_rank, vAR_CSLAB_numIterations, 0.01)
// Evaluate the model on rating data
val vAR_CSLAB_usersProducts = vAR_CSLAB_ratings.map { case Rating(user, product, rate) =>
(user, product)
}
val vAR_CSLAB_predictions =
vAR_CSLAB_model.predict(vAR_CSLAB_usersProducts).map { case Rating(user, product, rate) =>
((user, product), rate)
}
val vAR_CSLAB_ratesAndPreds = vAR_CSLAB_ratings.map { case Rating(user, product, rate) =>
((user, product), rate)
}.join(vAR_CSLAB_predictions)
val vAR_CSLAB_MSE = vAR_CSLAB_ratesAndPreds.map { case ((user, product), (r1, r2)) =>
val vAR_CSLAB_err = (r1 - r2)
vAR_CSLAB_err * vAR_CSLAB_err
}.mean()
println(s"Mean Squared Error = $vAR_CSLAB_MSE")
// Save and load model
vAR_CSLAB_model.save(sc, "myCollaborativeFilter")
val vAR_CSLAB_sameModel = MatrixFactorizationModel.load(sc, "myCollaborativeFilter")
}
}
RecommendationExample.main(Array(" "))
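The Mean Squared Error computed on the RDD above reduces to simple arithmetic over (actual, predicted) pairs. A minimal plain-Scala sketch (object name and data are illustrative):

```scala
object MseByHand {
  // Mean squared error over (actual, predicted) rating pairs.
  def mse(pairs: Seq[(Double, Double)]): Double =
    pairs.map { case (r, p) => val e = r - p; e * e }.sum / pairs.size

  def main(args: Array[String]): Unit = {
    val ratesAndPreds = Seq((5.0, 4.5), (1.0, 1.5), (3.0, 3.0))
    val vMse = mse(ratesAndPreds)
    // RMSE is just the square root of MSE, as reported by RegressionMetrics earlier
    println(s"MSE = $vMse, RMSE = ${math.sqrt(vMse)}")
  }
}
```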
/*****************************
File Name : CSLAB_SIMPLE_FP-GROWTH_V1
Purpose : A Program for Simple FP-Growth in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 11/02/2019 09:41 hrs
Version : 1.0
*****************************/
## Program Description : A Program for Simple FP-Growth in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.fpm.FPGrowth
import org.apache.spark.rdd.RDD
object SimpleFPGrowth {
def main(args: Array[String]) {
val vAR_CSLAB_conf = new SparkConf().setAppName("SimpleFPGrowth")
val sc = new SparkContext(vAR_CSLAB_conf)
val vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
val vAR_CSLAB_DATA_FILE = "sample_fpgrowth.txt"
val vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
val vAR_CSLAB_data = sc.textFile(vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_transactions: RDD[Array[String]] = vAR_CSLAB_data.map(vAR_CSLAB_s => vAR_CSLAB_s.trim.split(' '))
val vAR_CSLAB_fpg = new FPGrowth()
.setMinSupport(0.2)
.setNumPartitions(10)
val vAR_CSLAB_model = vAR_CSLAB_fpg.run(vAR_CSLAB_transactions)
vAR_CSLAB_model.freqItemsets.collect().foreach { vAR_CSLAB_itemset =>
println(s"${vAR_CSLAB_itemset.items.mkString("[", ",", "]")},${vAR_CSLAB_itemset.freq}")
}
val vAR_CSLAB_minConfidence = 0.8
vAR_CSLAB_model.generateAssociationRules(vAR_CSLAB_minConfidence).collect().foreach { vAR_CSLAB_rule =>
println(s"${vAR_CSLAB_rule.antecedent.mkString("[", ",", "]")} => " +
s"${vAR_CSLAB_rule.consequent.mkString("[", ",", "]")}, ${vAR_CSLAB_rule.confidence}")
}
}
}
SimpleFPGrowth.main(Array(""))
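The minSupport and minConfidence thresholds above can be sanity-checked without Spark. A small sketch of both definitions (the object name and transactions are illustrative, not the sample_fpgrowth.txt data):

```scala
object FPGrowthByHand {
  // Support of an itemset: fraction of transactions containing all of its items.
  def support(transactions: Seq[Set[String]], itemset: Set[String]): Double =
    transactions.count(t => itemset.subsetOf(t)).toDouble / transactions.size

  // Confidence of a rule X => Y: support(X union Y) / support(X).
  def confidence(transactions: Seq[Set[String]], antecedent: Set[String], consequent: Set[String]): Double =
    support(transactions, antecedent ++ consequent) / support(transactions, antecedent)

  def main(args: Array[String]): Unit = {
    val tx = Seq(Set("r", "z"), Set("z"), Set("r", "z", "y"), Set("r"))
    println(support(tx, Set("z")))                   // z occurs in 3 of 4 transactions
    println(confidence(tx, Set("r"), Set("z")))      // r => z
  }
}
```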
/*****************************
File Name : CSLAB_STANDARD_SCALAR_V1
Purpose : A Program for Standard Scaler in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 11/02/2019 09:41 hrs
Version : 1.0
*****************************/
## Program Description : A Program for Standard Scaler in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.mllib.feature.{StandardScaler, StandardScalerModel}
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.util.MLUtils
object StandardScalerExample {
def main(args: Array[String]): Unit = {
val vAR_CSLAB_conf = new SparkConf().setAppName("StandardScalerExample")
val sc = new SparkContext(vAR_CSLAB_conf)
val vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
val vAR_CSLAB_DATA_FILE = "sample_libsvm_data.txt"
val vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
val vAR_CSLAB_data = MLUtils.loadLibSVMFile(sc, vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_scaler1 = new StandardScaler().fit(vAR_CSLAB_data.map(vAR_CSLAB_x => vAR_CSLAB_x.features))
val vAR_CSLAB_scaler2 = new StandardScaler(withMean = true, withStd = true).fit(vAR_CSLAB_data.map(vAR_CSLAB_x => vAR_CSLAB_x.features))
// scaler3 is an identical model to scaler2, and will produce identical transformations
val vAR_CSLAB_scaler3 = new StandardScalerModel(vAR_CSLAB_scaler2.std, vAR_CSLAB_scaler2.mean)
// data1 will be unit variance.
val vAR_CSLAB_data1 = vAR_CSLAB_data.map(vAR_CSLAB_x => (vAR_CSLAB_x.label, vAR_CSLAB_scaler1.transform(vAR_CSLAB_x.features)))
// data2 will be unit variance and zero mean.
val vAR_CSLAB_data2 = vAR_CSLAB_data.map(vAR_CSLAB_x => (vAR_CSLAB_x.label, vAR_CSLAB_scaler2.transform(Vectors.dense(vAR_CSLAB_x.features.toArray))))
println("vAR_CSLAB_data1: ")
vAR_CSLAB_data1.foreach(vAR_CSLAB_x => println(vAR_CSLAB_x))
println("vAR_CSLAB_data2: ")
vAR_CSLAB_data2.foreach(vAR_CSLAB_x => println(vAR_CSLAB_x))
}
}
StandardScalerExample.main(Array(""))
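withMean/withStd scaling is simply the z-score transform. A one-column plain-Scala sketch (object name is illustrative; like MLlib's scaler, it uses the corrected, n-1 standard deviation):

```scala
object StandardScalerByHand {
  // z-score one column: subtract the mean, divide by the sample standard deviation.
  def scale(xs: Seq[Double]): Seq[Double] = {
    val mean = xs.sum / xs.size
    val variance = xs.map(x => (x - mean) * (x - mean)).sum / (xs.size - 1)
    val std = math.sqrt(variance)
    xs.map(x => (x - mean) / std)
  }

  def main(args: Array[String]): Unit =
    println(scale(Seq(1.0, 2.0, 3.0))) // mean 2.0, std 1.0 -> -1.0, 0.0, 1.0
}
```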
/*****************************
File Name : CSLAB_STARTIFIED_SAMPLING_V1
Purpose : A Program for Stratified Sampling in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 11/02/2019 10:23 hrs
Version : 1.0
*****************************/
## Program Description : A Program for Stratified Sampling in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.{SparkConf, SparkContext}
object StratifiedSamplingExample {
def main(args: Array[String]): Unit = {
val vAR_CSLAB_conf = new SparkConf().setAppName("StratifiedSamplingExample")
val sc = new SparkContext(vAR_CSLAB_conf)
// an RDD[(K, V)] of any key value pairs
val vAR_CSLAB_data = sc.parallelize(
Seq((1, 'a'), (1, 'b'), (2, 'c'), (2, 'd'), (2, 'e'), (3, 'f')))
// specify the exact fraction desired from each key
val vAR_CSLAB_fractions = Map(1 -> 0.1, 2 -> 0.6, 3 -> 0.3)
// Get an approximate sample from each stratum
val vAR_CSLAB_approxSample = vAR_CSLAB_data.sampleByKey(withReplacement = false, fractions = vAR_CSLAB_fractions)
// Get an exact sample from each stratum
val vAR_CSLAB_exactSample = vAR_CSLAB_data.sampleByKeyExact(withReplacement = false, fractions = vAR_CSLAB_fractions)
println(s"approxSample size is ${vAR_CSLAB_approxSample.collect().size}")
vAR_CSLAB_approxSample.collect().foreach(println)
println(s"exactSample size is ${vAR_CSLAB_exactSample.collect().size}")
vAR_CSLAB_exactSample.collect().foreach(println)
}
}
StratifiedSamplingExample.main(Array(""))
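sampleByKey performs per-key Bernoulli sampling with the given fractions. A local sketch of that idea (names are illustrative; a simple coin flip per element, whereas Spark's sampleByKeyExact does extra work to hit each stratum's size exactly):

```scala
object StratifiedSamplingByHand {
  // Keep each (key, value) pair with probability fractions(key).
  def sampleByKey[K, V](data: Seq[(K, V)], fractions: Map[K, Double], seed: Long): Seq[(K, V)] = {
    val rng = new scala.util.Random(seed)
    data.filter { case (k, _) => rng.nextDouble() < fractions(k) }
  }

  def main(args: Array[String]): Unit = {
    val data = Seq((1, 'a'), (1, 'b'), (2, 'c'), (2, 'd'), (2, 'e'), (3, 'f'))
    val fractions = Map(1 -> 0.1, 2 -> 0.6, 3 -> 0.3)
    println(sampleByKey(data, fractions, seed = 42L))
  }
}
```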
/*****************************
File Name : CSLAB_STREAMING_KMEANS_CLUSTERING_V1
Purpose : A Program for Streaming K-Means Clustering in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 11/02/2019 10:58 hrs
Version : 1.0
*****************************/
## Program Description : A Program for Streaming K-Means Clustering in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.SparkConf
import org.apache.spark.mllib.clustering.StreamingKMeans
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.streaming.{Seconds, StreamingContext}
object StreamingKMeansExample {
def main(args: Array[String]) {
if (args.length != 5) {
System.err.println(
"Usage: StreamingKMeansExample " +
"<trainingDir> <testDir> <batchDuration> <numClusters> <numDimensions>")
System.exit(1)
}
val vAR_CSLAB_conf = new SparkConf().setAppName("StreamingKMeansExample")
val ssc = new StreamingContext(vAR_CSLAB_conf, Seconds(args(2).toLong))
val vAR_CSLAB_trainingData = ssc.textFileStream(args(0)).map(Vectors.parse)
val vAR_CSLAB_testData = ssc.textFileStream(args(1)).map(LabeledPoint.parse)
val vAR_CSLAB_model = new StreamingKMeans()
.setK(args(3).toInt)
.setDecayFactor(1.0)
.setRandomCenters(args(4).toInt, 0.0)
vAR_CSLAB_model.trainOn(vAR_CSLAB_trainingData)
vAR_CSLAB_model.predictOnValues(vAR_CSLAB_testData.map(vAR_CSLAB_lp => (vAR_CSLAB_lp.label, vAR_CSLAB_lp.features))).print()
ssc.start()
ssc.awaitTermination()
}
}
//StreamingKMeansExample.main(Array(""))
/*****************************
File Name : CSLAB_STREAMING_LINEAR_REGRESSION_V1
Purpose : A Program for Streaming Linear Regression in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 11/02/2019 11:23 hrs
Version : 1.0
*****************************/
## Program Description : A Program for Streaming Linear Regression in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.SparkConf
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.mllib.regression.StreamingLinearRegressionWithSGD
import org.apache.spark.streaming._
object StreamingLinearRegressionExample {
def main(args: Array[String]): Unit = {
if (args.length != 2) {
System.err.println("Usage: StreamingLinearRegressionExample <trainingDir> <testDir>")
System.exit(1)
}
val vAR_CSLAB_conf = new SparkConf().setAppName("StreamingLinearRegressionExample")
val ssc = new StreamingContext(vAR_CSLAB_conf, Seconds(1))
val vAR_CSLAB_trainingData = ssc.textFileStream(args(0)).map(LabeledPoint.parse).cache()
val vAR_CSLAB_testData = ssc.textFileStream(args(1)).map(LabeledPoint.parse)
val vAR_CSLAB_numFeatures = 3
val vAR_CSLAB_model = new StreamingLinearRegressionWithSGD()
.setInitialWeights(Vectors.zeros(vAR_CSLAB_numFeatures))
vAR_CSLAB_model.trainOn(vAR_CSLAB_trainingData)
vAR_CSLAB_model.predictOnValues(vAR_CSLAB_testData.map(vAR_CSLAB_lp => (vAR_CSLAB_lp.label, vAR_CSLAB_lp.features))).print()
ssc.start()
ssc.awaitTermination()
//ssc.stop()
}
}
//StreamingLinearRegressionExample.main(Array(""))
/*****************************
File Name : CSLAB_STREAMING_LOGISTIC_REGRESSION_V1
Purpose : A Program for Streaming Logistic Regression in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 11/02/2019 11:58 hrs
Version : 1.0
*****************************/
## Program Description : A Program for Streaming Logistic Regression in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.SparkConf
import org.apache.spark.mllib.classification.StreamingLogisticRegressionWithSGD
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.streaming.{Seconds, StreamingContext}
object StreamingLogisticRegression {
def main(args: Array[String]) {
if (args.length != 4) {
System.err.println(
"Usage: StreamingLogisticRegression <trainingDir> <testDir> <batchDuration> <numFeatures>")
System.exit(1)
}
val vAR_CSLAB_conf = new SparkConf().setMaster("local").setAppName("StreamingLogisticRegression")
val ssc = new StreamingContext(vAR_CSLAB_conf, Seconds(args(2).toLong))
val vAR_CSLAB_trainingData = ssc.textFileStream(args(0)).map(LabeledPoint.parse)
val vAR_CSLAB_testData = ssc.textFileStream(args(1)).map(LabeledPoint.parse)
val vAR_CSLAB_model = new StreamingLogisticRegressionWithSGD()
.setInitialWeights(Vectors.zeros(args(3).toInt))
vAR_CSLAB_model.trainOn(vAR_CSLAB_trainingData)
vAR_CSLAB_model.predictOnValues(vAR_CSLAB_testData.map(vAR_CSLAB_lp => (vAR_CSLAB_lp.label, vAR_CSLAB_lp.features))).print()
ssc.start()
ssc.awaitTermination()
}
}
//StreamingLogisticRegression.main(Array(" "))
/*****************************
File Name : CSLAB_SUMMARY_STATISTICS_V1
Purpose : A Program for Summary Statistics in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 11/02/2019 12:29 hrs
Version : 1.0
*****************************/
## Program Description : A Program for Summary Statistics in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.stat.{MultivariateStatisticalSummary, Statistics}
object SummaryStatisticsExample {
def main(args: Array[String]): Unit = {
val vAR_CSLAB_conf = new SparkConf().setAppName("SummaryStatisticsExample")
val sc = new SparkContext(vAR_CSLAB_conf)
val vAR_CSLAB_observations = sc.parallelize(
Seq(
Vectors.dense(1.0, 10.0, 100.0),
Vectors.dense(2.0, 20.0, 200.0),
Vectors.dense(3.0, 30.0, 300.0)
)
)
// Compute column summary statistics.
val vAR_CSLAB_summary: MultivariateStatisticalSummary = Statistics.colStats(vAR_CSLAB_observations)
println(vAR_CSLAB_summary.mean) // a dense vector containing the mean value for each column
println(vAR_CSLAB_summary.variance) // column-wise variance
println(vAR_CSLAB_summary.numNonzeros) // number of nonzeros in each column
}
}
SummaryStatisticsExample.main(Array(""))
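colStats aggregates per column. A plain-Scala sketch of the column means and (unbiased, as MLlib reports) variances on the same three observations (the object name is illustrative):

```scala
object ColStatsByHand {
  // Column-wise mean of equal-length vectors.
  def colMeans(rows: Seq[Array[Double]]): Array[Double] =
    rows.transpose.map(col => col.sum / col.size).toArray

  // Column-wise unbiased (n-1 denominator) variance.
  def colVariances(rows: Seq[Array[Double]]): Array[Double] =
    rows.transpose.map { col =>
      val m = col.sum / col.size
      col.map(x => (x - m) * (x - m)).sum / (col.size - 1)
    }.toArray

  def main(args: Array[String]): Unit = {
    val obs = Seq(Array(1.0, 10.0, 100.0), Array(2.0, 20.0, 200.0), Array(3.0, 30.0, 300.0))
    println(colMeans(obs).mkString("[", ",", "]"))     // [2.0,20.0,200.0]
    println(colVariances(obs).mkString("[", ",", "]")) // [1.0,100.0,10000.0]
  }
}
```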
/*****************************
File Name : CSLAB_SINGLE_VALUE_DECOMPOSITION_V1
Purpose : A Program for Singular Value Decomposition in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 11/02/2019 13:13 hrs
Version : 1.0
*****************************/
## Program Description : A Program for Singular Value Decomposition in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.mllib.linalg.Matrix
import org.apache.spark.mllib.linalg.SingularValueDecomposition
import org.apache.spark.mllib.linalg.Vector
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.linalg.distributed.RowMatrix
object SVDExample {
def main(args: Array[String]): Unit = {
val vAR_CSLAB_conf = new SparkConf().setAppName("SVDExample")
val sc = new SparkContext(vAR_CSLAB_conf)
val vAR_CSLAB_data = Array(
Vectors.sparse(5, Seq((1, 1.0), (3, 7.0))),
Vectors.dense(2.0, 0.0, 3.0, 4.0, 5.0),
Vectors.dense(4.0, 0.0, 0.0, 6.0, 7.0))
val vAR_CSLAB_rows = sc.parallelize(vAR_CSLAB_data)
val vAR_CSLAB_mat: RowMatrix = new RowMatrix(vAR_CSLAB_rows)
// Compute the top 5 singular values and corresponding singular vectors.
val vAR_CSLAB_svd: SingularValueDecomposition[RowMatrix, Matrix] = vAR_CSLAB_mat.computeSVD(5, computeU = true)
val vAR_CSLAB_U: RowMatrix = vAR_CSLAB_svd.U // The U factor is a RowMatrix.
val vAR_CSLAB_s: Vector = vAR_CSLAB_svd.s // The singular values are stored in a local dense vector.
val vAR_CSLAB_V: Matrix = vAR_CSLAB_svd.V // The V factor is a local dense matrix.
val vAR_CSLAB_collect = vAR_CSLAB_U.rows.collect()
println("U factor is:")
vAR_CSLAB_collect.foreach { vAR_CSLAB_vector => println(vAR_CSLAB_vector) }
println(s"Singular values are: $vAR_CSLAB_s")
println(s"V factor is:\n$vAR_CSLAB_V")
}
}
SVDExample.main(Array(""))
/*****************************
File Name : CSLAB_SUPPORT_VECTOR_MACHINES_WITH_SGD_V1
Purpose : A Program for Support Vector Machines With SGD in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 11/02/2019 14:04 hrs
Version : 1.0
*****************************/
## Program Description : A Program for Support Vector Machines With SGD in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.classification.{SVMModel, SVMWithSGD}
import org.apache.spark.mllib.evaluation.BinaryClassificationMetrics
import org.apache.spark.mllib.util.MLUtils
object SVMWithSGDExample {
def main(args: Array[String]): Unit = {
val vAR_CSLAB_conf = new SparkConf().setAppName("SVMWithSGDExample")
val sc = new SparkContext(vAR_CSLAB_conf)
val vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
val vAR_CSLAB_DATA_FILE = "sample_libsvm_data.txt"
val vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load training data in LIBSVM format.
val vAR_CSLAB_data = MLUtils.loadLibSVMFile(sc, vAR_CSLAB_FILE_PATH)
// Split data into training (60%) and test (40%).
val vAR_CSLAB_splits = vAR_CSLAB_data.randomSplit(Array(0.6, 0.4), seed = 11L)
val vAR_CSLAB_training = vAR_CSLAB_splits(0).cache()
val vAR_CSLAB_test = vAR_CSLAB_splits(1)
// Run training algorithm to build the model
val vAR_CSLAB_numIterations = 100
val vAR_CSLAB_model = SVMWithSGD.train(vAR_CSLAB_training, vAR_CSLAB_numIterations)
// Clear the default threshold.
vAR_CSLAB_model.clearThreshold()
// Compute raw scores on the test set.
val vAR_CSLAB_scoreAndLabels = vAR_CSLAB_test.map { vAR_CSLAB_point =>
val vAR_CSLAB_score = vAR_CSLAB_model.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_score, vAR_CSLAB_point.label)
}
// Get evaluation metrics.
val vAR_CSLAB_metrics = new BinaryClassificationMetrics(vAR_CSLAB_scoreAndLabels)
val vAR_CSLAB_auROC = vAR_CSLAB_metrics.areaUnderROC()
println(s"Area under ROC = $vAR_CSLAB_auROC")
// Save and load model
vAR_CSLAB_model.save(sc, "scalaSVMWithSGDModel")
val vAR_CSLAB_sameModel = SVMModel.load(sc, "scalaSVMWithSGDModel")
}
}
SVMWithSGDExample.main(Array(""))
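Area under the ROC curve equals the probability that a randomly chosen positive example outscores a randomly chosen negative one (ties counting half). A local sketch of that pairwise definition (object name and scores are illustrative; Spark derives the same value from the ROC curve when scores are distinct):

```scala
object AucByHand {
  // AUC as P(score of a positive > score of a negative), ties count 0.5.
  def auc(scoreAndLabels: Seq[(Double, Double)]): Double = {
    val pos = scoreAndLabels.collect { case (s, 1.0) => s }
    val neg = scoreAndLabels.collect { case (s, 0.0) => s }
    val wins = for (p <- pos; n <- neg) yield {
      if (p > n) 1.0 else if (p == n) 0.5 else 0.0
    }
    wins.sum / (pos.size * neg.size)
  }

  def main(args: Array[String]): Unit = {
    val scoreAndLabels = Seq((0.9, 1.0), (0.8, 1.0), (0.4, 1.0), (0.6, 0.0), (0.3, 0.0))
    println(s"Area under ROC = ${auc(scoreAndLabels)}")
  }
}
```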
/*****************************
File Name : CSLAB_TAILSKINNY_PCA_V1
Purpose : A Program for Tall Skinny PCA in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 11/02/2019 14:23 hrs
Version : 1.0
*****************************/
## Program Description : A Program for Tall Skinny PCA in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.linalg.distributed.RowMatrix
object TallSkinnyPCA {
def main(args: Array[String]) {
if (args.length != 1) {
System.err.println("Usage: TallSkinnyPCA <input>")
System.exit(1)
}
val vAR_CSLAB_conf = new SparkConf().setAppName("TallSkinnyPCA")
val sc = new SparkContext(vAR_CSLAB_conf)
// Load and parse the data file.
val vAR_CSLAB_rows = sc.textFile(args(0)).map { vAR_CSLAB_line =>
val vAR_CSLAB_values = vAR_CSLAB_line.split(' ').map(_.toDouble)
Vectors.dense(vAR_CSLAB_values)
}
val vAR_CSLAB_mat = new RowMatrix(vAR_CSLAB_rows)
// Compute principal components.
val vAR_CSLAB_pc = vAR_CSLAB_mat.computePrincipalComponents(vAR_CSLAB_mat.numCols().toInt)
println(s"Principal components are:\n $vAR_CSLAB_pc")
}
}
/*****************************
File Name : CSLAB_TAILSKINNY_SVD_V1
Purpose : A Program for Tall Skinny SVD in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 11/02/2019 15:55 hrs
Version : 1.0
*****************************/
## Program Description : A Program for Tall Skinny SVD in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.linalg.distributed.RowMatrix
object TallSkinnySVD {
def main(args: Array[String]) {
if (args.length != 1) {
System.err.println("Usage: TallSkinnySVD <input>")
System.exit(1)
}
val vAR_CSLAB_conf = new SparkConf().setAppName("TallSkinnySVD")
val sc = new SparkContext(vAR_CSLAB_conf)
// Load and parse the data file.
val vAR_CSLAB_rows = sc.textFile(args(0)).map { vAR_CSLAB_line =>
val vAR_CSLAB_values = vAR_CSLAB_line.split(' ').map(_.toDouble)
Vectors.dense(vAR_CSLAB_values)
}
val vAR_CSLAB_mat = new RowMatrix(vAR_CSLAB_rows)
// Compute SVD.
val vAR_CSLAB_svd = vAR_CSLAB_mat.computeSVD(vAR_CSLAB_mat.numCols().toInt)
println(s"Singular values are ${vAR_CSLAB_svd.s}")
}
}
/*****************************
File Name : CSLAB_TFIDF_VECTORIZER_V1
Purpose : A Program for TF-IDF Vectorization in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 11/02/2019 16:21 hrs
Version : 1.0
*****************************/
## Program Description : A Program for TF-IDF Vectorization in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.mllib.feature.{HashingTF, IDF}
import org.apache.spark.mllib.linalg.Vector
import org.apache.spark.rdd.RDD
object TFIDFExample {
def main(args: Array[String]): Unit = {
val vAR_CSLAB_conf = new SparkConf().setAppName("TFIDFExample")
val sc = new SparkContext(vAR_CSLAB_conf)
val vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
val vAR_CSLAB_DATA_FILE = "kmeans_data.txt"
val vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load documents (one per line).
val vAR_CSLAB_documents: RDD[Seq[String]] = sc.textFile(vAR_CSLAB_FILE_PATH)
.map(_.split(" ").toSeq)
val vAR_CSLAB_hashingTF = new HashingTF()
val vAR_CSLAB_tf: RDD[Vector] = vAR_CSLAB_hashingTF.transform(vAR_CSLAB_documents)
// While applying HashingTF only needs a single pass to the data, applying IDF needs two passes:
// First to compute the IDF vector and second to scale the term frequencies by IDF.
vAR_CSLAB_tf.cache()
val vAR_CSLAB_idf = new IDF().fit(vAR_CSLAB_tf)
val vAR_CSLAB_tfidf: RDD[Vector] = vAR_CSLAB_idf.transform(vAR_CSLAB_tf)
// spark.mllib IDF implementation provides an option for ignoring terms which occur in less than
// a minimum number of documents. In such cases, the IDF for these terms is set to 0.
// This feature can be used by passing the minDocFreq value to the IDF constructor.
val vAR_CSLAB_idfIgnore = new IDF(minDocFreq = 2).fit(vAR_CSLAB_tf)
val vAR_CSLAB_tfidfIgnore: RDD[Vector] = vAR_CSLAB_idfIgnore.transform(vAR_CSLAB_tf)
println("tfidf: ")
vAR_CSLAB_tfidf.foreach(vAR_CSLAB_x => println(vAR_CSLAB_x))
println("tfidfIgnore: ")
vAR_CSLAB_tfidfIgnore.foreach(vAR_CSLAB_x => println(vAR_CSLAB_x))
}
}
TFIDFExample.main(Array(""))
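HashingTF and IDF can be mimicked locally: term frequency is a count per document, and MLlib's smoothed IDF is log((m + 1) / (df + 1)) over m documents. An illustrative sketch (object name and tiny corpus are assumptions, not the kmeans_data.txt input):

```scala
object TfIdfByHand {
  // Term frequency: raw counts within one document.
  def tf(doc: Seq[String]): Map[String, Double] =
    doc.groupBy(identity).map { case (t, occ) => t -> occ.size.toDouble }

  // Smoothed inverse document frequency per term, MLlib-style: log((m + 1) / (df + 1)).
  def idf(docs: Seq[Seq[String]]): Map[String, Double] = {
    val m = docs.size
    docs.flatMap(_.distinct).groupBy(identity).map { case (t, occ) =>
      t -> math.log((m + 1.0) / (occ.size + 1.0))
    }
  }

  def main(args: Array[String]): Unit = {
    val docs = Seq(Seq("a", "b", "a"), Seq("b", "c"))
    val idfs = idf(docs)
    val tfidf = tf(docs.head).map { case (t, f) => t -> f * idfs(t) }
    println(tfidf) // "b" appears in every document, so its weight collapses toward 0
  }
}
```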
/*****************************
File Name : CSLAB_WORD2VEC_VECTORIZER_V1
Purpose : A Program for Word2Vec Vectorizer in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 11/02/2019 16:53 hrs
Version : 1.0
*****************************/
## Program Description : A Program for Word2Vec Vectorizer in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.mllib.feature.{Word2Vec, Word2VecModel}
object Word2VecExample {
def main(args: Array[String]): Unit = {
val vAR_CSLAB_conf = new SparkConf().setAppName("Word2VecExample")
val sc = new SparkContext(vAR_CSLAB_conf)
val vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
val vAR_CSLAB_DATA_FILE = "sample_lda_data.txt"
val vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
val vAR_CSLAB_input = sc.textFile(vAR_CSLAB_FILE_PATH).map(line => line.split(" ").toSeq)
val vAR_CSLAB_word2vec = new Word2Vec()
val vAR_CSLAB_model = vAR_CSLAB_word2vec.fit(vAR_CSLAB_input)
val vAR_CSLAB_synonyms = vAR_CSLAB_model.findSynonyms("1", 5)
for((vAR_CSLAB_synonym, cosineSimilarity) <- vAR_CSLAB_synonyms) {
println(s"$vAR_CSLAB_synonym $cosineSimilarity")
}
// Save and load model
vAR_CSLAB_model.save(sc, "myModelPath12")
//val vAR_CSLAB_sameModel = Word2VecModel.load(sc, "myModelPath")
}
}
Word2VecExample.main(Array(""))
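findSynonyms ranks words by cosine similarity between their learned vectors. The measure itself, on plain arrays (object name is illustrative):

```scala
object CosineSimilarity {
  // Cosine similarity between two equal-length vectors: dot product over the norms.
  def cosine(a: Array[Double], b: Array[Double]): Double = {
    val dot = a.zip(b).map { case (x, y) => x * y }.sum
    val normA = math.sqrt(a.map(x => x * x).sum)
    val normB = math.sqrt(b.map(x => x * x).sum)
    dot / (normA * normB)
  }

  def main(args: Array[String]): Unit = {
    println(cosine(Array(1.0, 0.0), Array(1.0, 0.0))) // identical direction -> 1.0
    println(cosine(Array(1.0, 0.0), Array(0.0, 1.0))) // orthogonal -> 0.0
  }
}
```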
/*****************************
File Name : CSLAB_AREA_SPLINE_PLOT_V1
Purpose : A Program for Area Spline Plot in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 15/02/2019 9:22 hrs
Version : 1.0
*****************************/
## Program Description : A Program for Area Spline Plot in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object AreaSplineChart {
def main(args: Array[String]): Unit = {
import com.quantifind.charts.Highcharts._
areaspline(List(1, 2, 3, 4, 5), List(4, 1, 3, 2, 6))
}
}
/*****************************
File Name : CSLAB_PIE_PLOT_V1
Purpose : A Program for Pie Plot in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 15/02/2019 09:38 hrs
Version : 1.0
*****************************/
## Program Description : A Program for Pie Plot in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object PieChart {
def main(args: Array[String]): Unit = {
import com.quantifind.charts.Highcharts._
pie(Seq(4, 4, 5, 9))
}
}
/*****************************
File Name : CSLAB_REGRESSION_PLOT_V1
Purpose : A Program for Regression Plot in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 15/02/2019 09:56 hrs
Version : 1.0
*****************************/
## Program Description : A Program for Regression Plot in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object RegressionPlot {
def main(args: Array[String]): Unit = {
import com.quantifind.charts.Highcharts._
regression((0 until 100).map(x => -x + scala.util.Random.nextInt(25)))
}
}
/*****************************
File Name : CSLAB_STACKED_BAR_PLOT_V1
Purpose : A Program for Stacked Bar Plot in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 15/02/2019 10:09 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Stacked Bar Plot in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object StackedBarPlot {
def main(args: Array[String]): Unit = {
import com.quantifind.charts.Highcharts._
bar((0 until 20).map(_ % 8))
hold
bar((0 until 20).map(_ % 4))
stack()
title("Stacked Bars")
xAxis("Quantity")
yAxis("Price")
legend(List("Blue", "Black"))
}
}
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
/*****************************
/*****************************
File Name : CSLAB_SCATTER_PLOT_V1
Purpose : A Program for Scatter Plot in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 15/02/2019 10:22 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Scatter Plot in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object ScatterPlot {
def main(args: Array[String]): Unit = {
import org.sameersingh.scalaplot.Implicits._
val x = 0.0 until 2.0 * math.Pi by 0.1
xyChart(x -> (math.sin(_), math.cos(_)))
}
}
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
/*****************************
/*****************************
File Name : CSLAB_LINE_PLOT_V1
Purpose : A Program for Line Plot in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 15/02/2019 10:37 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Line Plot in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object LinePlot {
def main(args: Array[String]): Unit = {
import com.quantifind.charts.Highcharts._
line((1 to 10), (1 to 10))
}
}
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
/*****************************
/*****************************
File Name : CSLAB_BAR_PLOT_V1
Purpose : A Program for Bar Plot in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 15/02/2019 10:53 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Bar Plot in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object BarPlot {
def main(args: Array[String]): Unit = {
import com.quantifind.charts.Highcharts._
bar(0 to 40)
}
}
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
/*****************************
/*****************************
File Name : CSLAB_HORIZONTAL_BAR_PLOT_V1
Purpose : A Program for Horizontal Bar Plot in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 15/02/2019 11:08 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Horizontal Bar Plot in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object HorizontalBarPlot {
def main(args: Array[String]): Unit = {
import com.quantifind.charts.Highcharts._
bar(10 to 0 by -1)
}
}
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
/*****************************
/*****************************
File Name : CSLAB_USER_PRODUCT_RECOMMENDATION_V1
Purpose : A Program for User Product Recommendation in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 12/02/2019 9:21 hrs
Version : 1.0
/*****************************
## Program Description : A Program for User Product Recommendation in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.recommendation.ALS
import org.apache.spark.mllib.recommendation.MatrixFactorizationModel
import org.apache.spark.mllib.recommendation.Rating
object userproductRecommendation {
def main(args: Array[String]): Unit = {
val vAR_CSLAB_conf = new SparkConf().setAppName("userproductRecommendation")
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "als/test.data";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data ('sc' below is the SparkContext provided by the Spark shell or notebook runtime)
val vAR_CSLAB_data = sc.textFile(vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_ratings = vAR_CSLAB_data.map(_.split(',') match { case Array(user, item, rate) =>
Rating(user.toInt, item.toInt, rate.toDouble)
})
// Build the recommendation model using ALS
val vAR_CSLAB_rank = 10
val vAR_CSLAB_numIterations = 10
val vAR_CSLAB_model = ALS.train(vAR_CSLAB_ratings, vAR_CSLAB_rank, vAR_CSLAB_numIterations, 0.01)
// Evaluate the model on rating data
val vAR_CSLAB_usersProducts = vAR_CSLAB_ratings.map { case Rating(user, product, rate) =>
(user, product)
}
val vAR_CSLAB_predictions =
vAR_CSLAB_model.predict(vAR_CSLAB_usersProducts).map { case Rating(user, product, rate) =>
((user, product), rate)
}
val vAR_CSLAB_ratesAndPreds = vAR_CSLAB_ratings.map { case Rating(user, product, rate) =>
((user, product), rate)
}.join(vAR_CSLAB_predictions)
val vAR_CSLAB_MSE = vAR_CSLAB_ratesAndPreds.map { case ((user, product), (r1, r2)) =>
val vAR_CSLAB_err = (r1 - r2)
vAR_CSLAB_err * vAR_CSLAB_err
}.mean()
println(s"Mean Squared Error = $vAR_CSLAB_MSE")
// Printing the RDD object itself would not show its contents; collect a sample instead
vAR_CSLAB_predictions.take(10).foreach { case ((user, product), rate) =>
println(s"Predicted rating for user $user, product $product = $rate")
}
// Save and load model
vAR_CSLAB_model.save(sc, "userproductRecommendation")
val vAR_CSLAB_sameModel = MatrixFactorizationModel.load(sc, "userproductRecommendation")
}
}
userproductRecommendation.main(Array(""))
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
/*****************************
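The ALS model above represents each user and each product as a latent factor vector, and `model.predict` scores a pair by taking the dot product of the two. A minimal plain-Scala sketch of that scoring step, using made-up factor values (illustrative numbers only, not output of the program above):

```scala
// Hypothetical rank-3 latent factors for one user and one product
// (illustrative values; real factors come from ALS.train)
val userFactors = Array(0.8, 1.2, -0.5)
val productFactors = Array(1.0, 0.5, 0.4)

// ALS predicts the rating for (user, product) as the dot product of the two vectors
val predictedRating = userFactors.zip(productFactors).map { case (u, p) => u * p }.sum

println(f"Predicted rating = $predictedRating%.2f")
```

Training with a higher rank gives longer factor vectors and a more expressive model, at the cost of more computation and a greater risk of overfitting.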
/*****************************
File Name : CSLAB_CUSTOMER_CHURN_ANALYSIS_V1
Purpose : A Program for Customer Churn Analysis in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 12/02/2019 9:34 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Customer Churn Analysis in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.tree.RandomForest
import org.apache.spark.mllib.tree.model.RandomForestModel
import org.apache.spark.mllib.util.MLUtils
object CustomerChurnAnalysis {
def main(args: Array[String]): Unit = {
val vAR_CSLAB_conf = new SparkConf().setAppName("CustomerChurnAnalysis")
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "sample_libsvm_data.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data file.
val vAR_CSLAB_data = MLUtils.loadLibSVMFile(sc, vAR_CSLAB_FILE_PATH)
// Split the data into training and test sets (30% held out for testing)
val vAR_CSLAB_splits = vAR_CSLAB_data.randomSplit(Array(0.7, 0.3))
val (vAR_CSLAB_trainingData, vAR_CSLAB_testData) = (vAR_CSLAB_splits(0), vAR_CSLAB_splits(1))
// Train a RandomForest model.
// Empty categoricalFeaturesInfo indicates all features are continuous.
val vAR_CSLAB_numClasses = 2
val vAR_CSLAB_categoricalFeaturesInfo = Map[Int, Int]()
val vAR_CSLAB_numTrees = 3 // Use more in practice.
val vAR_CSLAB_featureSubsetStrategy = "auto" // Let the algorithm choose.
val vAR_CSLAB_impurity = "gini"
val vAR_CSLAB_maxDepth = 4
val vAR_CSLAB_maxBins = 32
val vAR_CSLAB_model = RandomForest.trainClassifier(vAR_CSLAB_trainingData, vAR_CSLAB_numClasses, vAR_CSLAB_categoricalFeaturesInfo,
vAR_CSLAB_numTrees, vAR_CSLAB_featureSubsetStrategy, vAR_CSLAB_impurity, vAR_CSLAB_maxDepth, vAR_CSLAB_maxBins)
// Evaluate model on test instances and compute test error
val vAR_CSLAB_labelAndPreds = vAR_CSLAB_testData.map { vAR_CSLAB_point =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_point.label, vAR_CSLAB_prediction)
}
val vAR_CSLAB_testErr = vAR_CSLAB_labelAndPreds.filter(vAR_CSLAB_r => vAR_CSLAB_r._1 != vAR_CSLAB_r._2).count.toDouble / vAR_CSLAB_testData.count()
println(s"Test Error = $vAR_CSLAB_testErr")
println(s"Learned classification forest model:\n ${vAR_CSLAB_model.toDebugString}")
// Save and load model
vAR_CSLAB_model.save(sc, "CustomerChurnAnalysis")
val vAR_CSLAB_sameModel = RandomForestModel.load(sc, "CustomerChurnAnalysis")
}
}
CustomerChurnAnalysis.main(Array(" "))
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
/*****************************
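The test-error line above is simply the fraction of held-out points the forest misclassifies. The same computation on a small hand-made set of (label, prediction) pairs (hypothetical values, not produced by the program):

```scala
// Hypothetical (label, prediction) pairs standing in for labelAndPreds above
val labelAndPreds = Seq((1.0, 1.0), (0.0, 1.0), (1.0, 1.0), (0.0, 0.0), (1.0, 0.0))

// Test error = share of examples where the prediction disagrees with the label
val testErr = labelAndPreds.count { case (label, pred) => label != pred }.toDouble / labelAndPreds.size

println(s"Test Error = $testErr")
```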
/*****************************
File Name : CSLAB_CUSTOMER_CROSS_SELLING_ANALYSIS_V1
Purpose : A Program for Customer Cross Selling in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 12/02/2019 10:03 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Customer Cross Selling in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.rdd.RDD
import org.apache.spark.mllib.fpm.FPGrowth
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "sample_fpgrowth.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
val vAR_CSLAB_data = sc.textFile(vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_transactions: RDD[Array[String]] = vAR_CSLAB_data.map(s => s.trim.split(' '))
val vAR_CSLAB_fpg = new FPGrowth()
.setMinSupport(0.2)
.setNumPartitions(10)
val vAR_CSLAB_model = vAR_CSLAB_fpg.run(vAR_CSLAB_transactions)
vAR_CSLAB_model.freqItemsets.collect().foreach { vAR_CSLAB_itemset =>
println(vAR_CSLAB_itemset.items.mkString("[", ",", "]") + ", " + vAR_CSLAB_itemset.freq)
}
val vAR_CSLAB_minConfidence = 0.8
vAR_CSLAB_model.generateAssociationRules(vAR_CSLAB_minConfidence).collect().foreach { vAR_CSLAB_rule =>
println(
vAR_CSLAB_rule.antecedent.mkString("[", ",", "]")
+ " => " + vAR_CSLAB_rule.consequent.mkString("[", ",", "]")
+ ", " + vAR_CSLAB_rule.confidence)
}
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
/*****************************
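The two FP-growth thresholds above have simple definitions: the support of an itemset is the fraction of transactions containing it, and the confidence of a rule is the support of antecedent plus consequent divided by the support of the antecedent. A plain-Scala sketch on a toy basket list (hypothetical data, no Spark required):

```scala
// Toy transactions, standing in for the contents of sample_fpgrowth.txt
val transactions = Seq(
  Set("r", "z"), Set("z", "y", "x"), Set("z"), Set("x", "y"), Set("x", "z", "y")
)

// Support: fraction of transactions containing every item of the itemset
def support(itemset: Set[String]): Double =
  transactions.count(t => itemset.subsetOf(t)).toDouble / transactions.size

// Confidence of the rule antecedent => consequent
def confidence(antecedent: Set[String], consequent: Set[String]): Double =
  support(antecedent ++ consequent) / support(antecedent)

val supZ = support(Set("z"))
val confYX = confidence(Set("y"), Set("x"))
println(s"support({z}) = $supZ, confidence({y} => {x}) = $confYX")
```

With setMinSupport(0.2) as in the program above, only itemsets appearing in at least 20% of transactions survive; generateAssociationRules then keeps rules whose confidence clears the 0.8 threshold.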
/*****************************
File Name : CSLAB_CUSTOMER_MARKET_SHARE_V1
Purpose : A Program for Customer Market Share Analysis in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 12/02/2019 10:34 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Customer Market Share Analysis in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.mllib.regression.LinearRegressionModel
import org.apache.spark.mllib.regression.LinearRegressionWithSGD
// Note: LinearRegressionWithSGD is deprecated since Spark 2.0; prefer org.apache.spark.ml.regression.LinearRegression
object CustomerMarketShareAnalysis {
def main(args: Array[String]): Unit = {
val vAR_CSLAB_conf = new SparkConf().setAppName("CustomerMarketShareAnalysis")
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "ridge-data/lpsa.data";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data
val vAR_CSLAB_data = sc.textFile(vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_parsedData = vAR_CSLAB_data.map { vAR_CSLAB_line =>
val vAR_CSLAB_parts = vAR_CSLAB_line.split(',')
LabeledPoint(vAR_CSLAB_parts(0).toDouble, Vectors.dense(vAR_CSLAB_parts(1).split(' ').map(_.toDouble)))
}.cache()
// Building the model
val vAR_CSLAB_numIterations = 100
val vAR_CSLAB_stepSize = 0.00000001
val vAR_CSLAB_model = LinearRegressionWithSGD.train(vAR_CSLAB_parsedData, vAR_CSLAB_numIterations, vAR_CSLAB_stepSize)
// Evaluate model on training examples and compute training error
val vAR_CSLAB_valuesAndPreds = vAR_CSLAB_parsedData.map { vAR_CSLAB_point =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_point.label, vAR_CSLAB_prediction)
}
val vAR_CSLAB_MSE = vAR_CSLAB_valuesAndPreds.map{ case(v, p) => math.pow((v - p), 2) }.mean()
println(s"training Mean Squared Error $vAR_CSLAB_MSE")
// Save and load model
vAR_CSLAB_model.save(sc, "CustomerMarketShareAnalysis")
val vAR_CSLAB_sameModel = LinearRegressionModel.load(sc, "CustomerMarketShareAnalysis")
}
}
CustomerMarketShareAnalysis.main(Array(""))
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
/*****************************
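The training error printed above is the mean squared error: the average of the squared residuals between each label and its prediction. The same computation by hand on hypothetical pairs (no Spark needed):

```scala
// Hypothetical (label, prediction) pairs standing in for valuesAndPreds above
val valuesAndPreds = Seq((3.0, 2.5), (1.0, 1.5), (4.0, 4.0))

// MSE: average of squared residuals
val mse = valuesAndPreds.map { case (v, p) => math.pow(v - p, 2) }.sum / valuesAndPreds.size

println(s"training Mean Squared Error = $mse")
```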
/*****************************
File Name : CSLAB_CUSTOMER_UPSELLING_CHARACTERISTICS_PREDICTION_V1
Purpose : A Program for Customer Upselling Characteristics Prediction in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 12/02/2019 11:01 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Customer Upselling Characteristics Prediction in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.fpm.PrefixSpan
object CustomerUpsellingCharacteristics {
def main(args: Array[String]) {
val vAR_CSLAB_conf = new SparkConf().setAppName("CustomerUpsellingCharacteristics")
val vAR_CSLAB_sequences = sc.parallelize(Seq(
Array(Array(1, 2), Array(3)),
Array(Array(1), Array(3, 2), Array(1, 2)),
Array(Array(1, 2), Array(5)),
Array(Array(6))
), 2).cache()
val vAR_CSLAB_prefixSpan = new PrefixSpan()
.setMinSupport(0.5)
.setMaxPatternLength(5)
val vAR_CSLAB_model = vAR_CSLAB_prefixSpan.run(vAR_CSLAB_sequences)
vAR_CSLAB_model.freqSequences.collect().foreach { vAR_CSLAB_freqSequence =>
println(
vAR_CSLAB_freqSequence.sequence.map(_.mkString("[", ", ", "]")).mkString("[", ", ", "]") +
", " + vAR_CSLAB_freqSequence.freq)
}
}
}
CustomerUpsellingCharacteristics.main(Array(""))
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
/*****************************
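PrefixSpan treats a pattern as frequent when it occurs as an ordered subsequence in at least a minSupport fraction of the input sequences. The helper below (a hypothetical illustration, not part of the MLlib API) checks that containment by hand against the same toy sequences used above:

```scala
// True when each itemset of the pattern is contained, in order,
// in distinct itemsets of the sequence
def isSubsequence(pattern: Seq[Set[Int]], sequence: Seq[Set[Int]]): Boolean = {
  var rest = sequence
  pattern.forall { itemset =>
    val i = rest.indexWhere(s => itemset.subsetOf(s))
    if (i >= 0) { rest = rest.drop(i + 1); true } else false
  }
}

// The same four toy sequences as in the program above
val sequences = Seq(
  Seq(Set(1, 2), Set(3)),
  Seq(Set(1), Set(3, 2), Set(1, 2)),
  Seq(Set(1, 2), Set(5)),
  Seq(Set(6))
)

// Count the sequences containing the pattern [[1], [3]]
val supportCount = sequences.count(s => isSubsequence(Seq(Set(1), Set(3)), s))
println(s"[[1], [3]] occurs in $supportCount of ${sequences.size} sequences")
```

With setMinSupport(0.5) as above, the pattern [[1], [3]] qualifies as frequent here, since 2 of the 4 sequences contain it.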
/*****************************
File Name : CSLAB_PATIENT_READMITTANCE_RATE_ANALYSIS_V1
Purpose : A Program for Patient Readmission Rate Analysis in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 12/02/2019 11:29 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Patient Readmission Rate Analysis in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.mllib.regression.LinearRegressionModel
import org.apache.spark.mllib.regression.LinearRegressionWithSGD
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "ridge-data/lpsa.data";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data
val vAR_CSLAB_data = sc.textFile(vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_parsedData = vAR_CSLAB_data.map { line =>
val vAR_CSLAB_parts = line.split(',')
LabeledPoint(vAR_CSLAB_parts(0).toDouble, Vectors.dense(vAR_CSLAB_parts(1).split(' ').map(_.toDouble)))
}.cache()
// Building the model
val vAR_CSLAB_numIterations = 100
val vAR_CSLAB_stepSize = 0.00000001
val vAR_CSLAB_model = LinearRegressionWithSGD.train(vAR_CSLAB_parsedData, vAR_CSLAB_numIterations, vAR_CSLAB_stepSize)
// Evaluate model on training examples and compute training error
val vAR_CSLAB_valuesAndPreds = vAR_CSLAB_parsedData.map { vAR_CSLAB_point =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_point.label, vAR_CSLAB_prediction)
}
val vAR_CSLAB_MSE = vAR_CSLAB_valuesAndPreds.map{ case(v, p) => math.pow((v - p), 2) }.mean()
println("training Mean Squared Error = " + vAR_CSLAB_MSE)
// Save and load model
vAR_CSLAB_model.save(sc, "PatientReadmissionRateAnalysis")
val vAR_CSLAB_sameModel = LinearRegressionModel.load(sc, "PatientReadmissionRateAnalysis")
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
/*****************************
/*****************************
File Name : CSLAB_PATIENT_SURVIVAL_ANALYSIS_V1
Purpose : A Program for Patient Survival Analysis in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 12/02/2019 11:54 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Patient Survival Analysis in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.tree.DecisionTree
import org.apache.spark.mllib.tree.model.DecisionTreeModel
import org.apache.spark.mllib.util.MLUtils
object PatientSurvivalAnalysis {
def main(args: Array[String]): Unit = {
val vAR_CSLAB_conf = new SparkConf().setAppName("PatientSurvivalAnalysis")
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "sample_libsvm_data.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data file.
val vAR_CSLAB_data = MLUtils.loadLibSVMFile(sc, vAR_CSLAB_FILE_PATH)
// Split the data into training and test sets (30% held out for testing)
val vAR_CSLAB_splits = vAR_CSLAB_data.randomSplit(Array(0.7, 0.3))
val (vAR_CSLAB_trainingData, vAR_CSLAB_testData) = (vAR_CSLAB_splits(0), vAR_CSLAB_splits(1))
// Train a DecisionTree model.
// Empty categoricalFeaturesInfo indicates all features are continuous.
val vAR_CSLAB_numClasses = 2
val vAR_CSLAB_categoricalFeaturesInfo = Map[Int, Int]()
val vAR_CSLAB_impurity = "gini"
val vAR_CSLAB_maxDepth = 5
val vAR_CSLAB_maxBins = 32
val vAR_CSLAB_model = DecisionTree.trainClassifier(vAR_CSLAB_trainingData, vAR_CSLAB_numClasses, vAR_CSLAB_categoricalFeaturesInfo,
vAR_CSLAB_impurity, vAR_CSLAB_maxDepth, vAR_CSLAB_maxBins)
// Evaluate model on test instances and compute test error
val vAR_CSLAB_labelAndPreds = vAR_CSLAB_testData.map { vAR_CSLAB_point =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_point.label, vAR_CSLAB_prediction)
}
val vAR_CSLAB_testErr = vAR_CSLAB_labelAndPreds.filter(r => r._1 != r._2).count().toDouble / vAR_CSLAB_testData.count()
println(s"Test Error = $vAR_CSLAB_testErr")
println(s"Learned classification tree model:\n ${vAR_CSLAB_model.toDebugString}")
// Save and load model
vAR_CSLAB_model.save(sc, "PatientSurvivalAnalysis")
val vAR_CSLAB_sameModel = DecisionTreeModel.load(sc, "PatientSurvivalAnalysis")
}
}
PatientSurvivalAnalysis.main(Array(" "))
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
/*****************************
/*****************************
File Name : CSLAB_EMPLOYEE_ATTRITION_ANALYSIS_V1
Purpose : A Program for Employee Attrition in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 12/02/2019 12:27 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Employee Attrition in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.mllib.regression.LinearRegressionModel
import org.apache.spark.mllib.regression.LinearRegressionWithSGD
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "ridge-data/lpsa.data";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data
val vAR_CSLAB_data = sc.textFile(vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_parsedData = vAR_CSLAB_data.map { line =>
val vAR_CSLAB_parts = line.split(',')
LabeledPoint(vAR_CSLAB_parts(0).toDouble, Vectors.dense(vAR_CSLAB_parts(1).split(' ').map(_.toDouble)))
}.cache()
// Building the model
val vAR_CSLAB_numIterations = 100
val vAR_CSLAB_stepSize = 0.00000001
val vAR_CSLAB_model = LinearRegressionWithSGD.train(vAR_CSLAB_parsedData, vAR_CSLAB_numIterations, vAR_CSLAB_stepSize)
// Evaluate model on training examples and compute training error
val vAR_CSLAB_valuesAndPreds = vAR_CSLAB_parsedData.map { vAR_CSLAB_point =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_point.label, vAR_CSLAB_prediction)
}
val vAR_CSLAB_MSE = vAR_CSLAB_valuesAndPreds.map{ case(v, p) => math.pow((v - p), 2) }.mean()
println("training Mean Squared Error = " + vAR_CSLAB_MSE)
// Save and load model
vAR_CSLAB_model.save(sc, "EmployeeAttritionAnalysis")
val vAR_CSLAB_sameModel = LinearRegressionModel.load(sc, "EmployeeAttritionAnalysis")
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
/*****************************
/*****************************
File Name : CSLAB_CUSTOMER_SEGMENTATION_V1
Purpose : A Program for Customer Segmentation by Product in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 12/02/2019 12:56 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Customer Segmentation by Product in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.clustering.{KMeans, KMeansModel}
import org.apache.spark.mllib.linalg.Vectors
object CustomerSegmentation {
def main(args: Array[String]) {
val vAR_CSLAB_conf = new SparkConf().setAppName("CustomerSegmentation")
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "kmeans_data.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data
val vAR_CSLAB_data = sc.textFile(vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_parsedData = vAR_CSLAB_data.map(s => Vectors.dense(s.split(' ').map(_.toDouble))).cache()
// Cluster the data into two classes using KMeans
val vAR_CSLAB_numClusters = 2
val vAR_CSLAB_numIterations = 20
val vAR_CSLAB_clusters = KMeans.train(vAR_CSLAB_parsedData, vAR_CSLAB_numClusters, vAR_CSLAB_numIterations)
// Evaluate clustering by computing Within Set Sum of Squared Errors
val vAR_CSLAB_WSSSE = vAR_CSLAB_clusters.computeCost(vAR_CSLAB_parsedData)
println(s"Within Set Sum of Squared Errors = $vAR_CSLAB_WSSSE")
// Save and load model
vAR_CSLAB_clusters.save(sc, "CustomerSegmentation")
val vAR_CSLAB_sameModel = KMeansModel.load(sc, "CustomerSegmentation")
}
}
CustomerSegmentation.main(Array(""))
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
/*****************************
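WSSSE, the quantity computeCost returns above, is the sum over all points of the squared distance to the nearest cluster center. A hand computation on one-dimensional toy data (hypothetical points and centers, chosen purely for illustration):

```scala
// Toy 1-D points and two hypothetical cluster centers
val points = Seq(0.0, 0.1, 0.2, 9.0, 9.1, 9.2)
val centers = Seq(0.1, 9.1)

// Each point contributes its squared distance to the nearest center
val wssse = points.map(p => centers.map(c => math.pow(p - c, 2)).min).sum

println(s"Within Set Sum of Squared Errors = $wssse")
```

Tight clusters around well-placed centers give a small WSSSE; rerunning KMeans with different numClusters values and comparing this cost is a common way to choose the cluster count.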
/*****************************
File Name : CSLAB_PRODUCT_MARKET_BASKET_ANALYSIS_V1
Purpose : A Program for Product Market Basket Analysis in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 12/02/2019 13:45 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Product Market Basket Analysis in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.clustering.{KMeans, KMeansModel}
import org.apache.spark.mllib.linalg.Vectors
object ProductMarketBasketAnalysis {
def main(args: Array[String]) {
val vAR_CSLAB_conf = new SparkConf().setAppName("ProductMarketBasketAnalysis")
//val sc = new SparkContext(vAR_CSLAB_conf)
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "kmeans_data.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data
val vAR_CSLAB_data = sc.textFile(vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_parsedData = vAR_CSLAB_data.map(s => Vectors.dense(s.split(' ').map(_.toDouble))).cache()
// Cluster the data into two classes using KMeans
val vAR_CSLAB_numClusters = 2
val vAR_CSLAB_numIterations = 20
val vAR_CSLAB_clusters = KMeans.train(vAR_CSLAB_parsedData, vAR_CSLAB_numClusters, vAR_CSLAB_numIterations)
// Evaluate clustering by computing Within Set Sum of Squared Errors
val vAR_CSLAB_WSSSE = vAR_CSLAB_clusters.computeCost(vAR_CSLAB_parsedData)
println(s"Within Set Sum of Squared Errors = $vAR_CSLAB_WSSSE")
// Save and load model
vAR_CSLAB_clusters.save(sc, "target/org/apache/spark/KMeansExample/ProductMarketBasketAnalysis")
val vAR_CSLAB_sameModel = KMeansModel.load(sc, "target/org/apache/spark/KMeansExample/ProductMarketBasketAnalysis")
// Do not call sc.stop() here: stopping the shell-provided SparkContext would break later examples
}
}
ProductMarketBasketAnalysis.main(Array(""))
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
/*****************************
/*****************************
File Name : CSLAB_SALES_GROWTH_ANALYSIS_V1
Purpose : A Program for Product Sales Growth Analysis in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 12/02/2019 14:12 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Product Sales Growth Analysis in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.mllib.regression.LinearRegressionModel
import org.apache.spark.mllib.regression.LinearRegressionWithSGD
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "ridge-data/lpsa.data";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data
val vAR_CSLAB_data = sc.textFile(vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_parsedData = vAR_CSLAB_data.map { line =>
val vAR_CSLAB_parts = line.split(',')
LabeledPoint(vAR_CSLAB_parts(0).toDouble, Vectors.dense(vAR_CSLAB_parts(1).split(' ').map(_.toDouble)))
}.cache()
// Building the model
val vAR_CSLAB_numIterations = 100
val vAR_CSLAB_stepSize = 0.00000001
val vAR_CSLAB_model = LinearRegressionWithSGD.train(vAR_CSLAB_parsedData, vAR_CSLAB_numIterations, vAR_CSLAB_stepSize)
// Evaluate model on training examples and compute training error
val vAR_CSLAB_valuesAndPreds = vAR_CSLAB_parsedData.map { vAR_CSLAB_point =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_point.label, vAR_CSLAB_prediction)
}
val vAR_CSLAB_MSE = vAR_CSLAB_valuesAndPreds.map{ case(v, p) => math.pow((v - p), 2) }.mean()
println("training Mean Squared Error = " + vAR_CSLAB_MSE)
// Save and load model
vAR_CSLAB_model.save(sc, "ProductSalesGrowth")
val vAR_CSLAB_sameModel = LinearRegressionModel.load(sc, "ProductSalesGrowth")
/*****************************
Disclaimer:
We are providing this code block strictly for learning and researching. This is not a production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products that are referenced in these materials, belong to the respective vendor who developed or who owns this product.
/*****************************
/*****************************
File Name : CSLAB_PATIENT_HOSPITAL_LENGTH_OF_STAY_AT_THE_HOSPITAL_V1
Purpose : A Program for Prediction of Patient's Length of Stay at the Hospital in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 12/02/2019 14:41 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Prediction of Patient's Length of Stay at the Hospital in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.mllib.regression.LinearRegressionModel
import org.apache.spark.mllib.regression.LinearRegressionWithSGD
val vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
val vAR_CSLAB_DATA_FILE = "ridge-data/lpsa.data"
val vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data
val vAR_CSLAB_data = sc.textFile(vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_parsedData = vAR_CSLAB_data.map { line =>
val vAR_CSLAB_parts = line.split(',')
LabeledPoint(vAR_CSLAB_parts(0).toDouble, Vectors.dense(vAR_CSLAB_parts(1).split(' ').map(_.toDouble)))
}.cache()
// Building the model
val vAR_CSLAB_numIterations = 100
val vAR_CSLAB_stepSize = 0.00000001
val vAR_CSLAB_model = LinearRegressionWithSGD.train(vAR_CSLAB_parsedData, vAR_CSLAB_numIterations, vAR_CSLAB_stepSize)
// Evaluate model on training examples and compute training error
val vAR_CSLAB_valuesAndPreds = vAR_CSLAB_parsedData.map { vAR_CSLAB_point =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_point.label, vAR_CSLAB_prediction)
}
val vAR_CSLAB_MSE = vAR_CSLAB_valuesAndPreds.map{ case(v, p) => math.pow((v - p), 2) }.mean()
println("training Mean Squared Error = " + vAR_CSLAB_MSE)
// Save and load model
vAR_CSLAB_model.save(sc, "PatientLOS")
val vAR_CSLAB_sameModel = LinearRegressionModel.load(sc, "PatientLOS")
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance; by using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to their respective vendors, who developed and own those products.
*****************************/
/*****************************
File Name : CSLAB_PATIENT_HEALTH_RISK_PREDICTION_V1
Purpose : A Program for Prediction of Patient's Health Risk in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 12/02/2019 15:18 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Prediction of Patient's Health Risk in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.classification.{LogisticRegressionModel, LogisticRegressionWithLBFGS}
import org.apache.spark.mllib.evaluation.MulticlassMetrics
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.mllib.util.MLUtils
val vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
val vAR_CSLAB_DATA_FILE = "sample_libsvm_data.txt"
val vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load training data in LIBSVM format.
val vAR_CSLAB_data = MLUtils.loadLibSVMFile(sc, vAR_CSLAB_FILE_PATH)
// Split data into training (60%) and test (40%).
val vAR_CSLAB_splits = vAR_CSLAB_data.randomSplit(Array(0.6, 0.4), seed = 11L)
val vAR_CSLAB_training = vAR_CSLAB_splits(0).cache()
val vAR_CSLAB_test = vAR_CSLAB_splits(1)
// Run training algorithm to build the model
val vAR_CSLAB_model = new LogisticRegressionWithLBFGS().setNumClasses(10).run(vAR_CSLAB_training)
// Compute raw scores on the test set.
val vAR_CSLAB_predictionAndLabels = vAR_CSLAB_test.map { case LabeledPoint(label, features) =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(features)
(vAR_CSLAB_prediction, label)
}
// Get evaluation metrics.
val vAR_CSLAB_metrics = new MulticlassMetrics(vAR_CSLAB_predictionAndLabels)
val vAR_CSLAB_accuracy = vAR_CSLAB_metrics.accuracy
println(s"Accuracy = $vAR_CSLAB_accuracy")
// Save and load model
vAR_CSLAB_model.save(sc, "PatientHealthRisk")
val vAR_CSLAB_sameModel = LogisticRegressionModel.load(sc,"PatientHealthRisk")
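`MulticlassMetrics.accuracy` is simply the fraction of test points whose predicted class matches the true class. A minimal Spark-free sketch of that metric (our naming, not the Spark implementation):

```scala
object AccuracySketch {
  // Fraction of (prediction, label) pairs that agree.
  def accuracy(predictionAndLabels: Seq[(Double, Double)]): Double =
    predictionAndLabels.count { case (p, l) => p == l }.toDouble / predictionAndLabels.size
}
```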
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance; by using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to their respective vendors, who developed and own those products.
*****************************/
/*****************************
File Name : CSLAB_PATIENT_MEDICATION_ADHERENCE_PREDICTION_V1
Purpose : A Program for Prediction of Patient Medication Adherence in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 12/02/2019 15:42 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Prediction of Patient Medication Adherence in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.tree.DecisionTree
import org.apache.spark.mllib.tree.model.DecisionTreeModel
import org.apache.spark.mllib.util.MLUtils
val vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
val vAR_CSLAB_DATA_FILE = "sample_libsvm_data.txt"
val vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data file.
val vAR_CSLAB_data = MLUtils.loadLibSVMFile(sc, vAR_CSLAB_FILE_PATH)
// Split the data into training and test sets (30% held out for testing)
val vAR_CSLAB_splits = vAR_CSLAB_data.randomSplit(Array(0.7, 0.3))
val (vAR_CSLAB_trainingData, vAR_CSLAB_testData) = (vAR_CSLAB_splits(0), vAR_CSLAB_splits(1))
// Train a DecisionTree model.
// Empty categoricalFeaturesInfo indicates all features are continuous.
val vAR_CSLAB_numClasses = 2
val vAR_CSLAB_categoricalFeaturesInfo = Map[Int, Int]()
val vAR_CSLAB_impurity = "gini"
val vAR_CSLAB_maxDepth = 5
val vAR_CSLAB_maxBins = 32
val vAR_CSLAB_model = DecisionTree.trainClassifier(vAR_CSLAB_trainingData, vAR_CSLAB_numClasses, vAR_CSLAB_categoricalFeaturesInfo,
vAR_CSLAB_impurity, vAR_CSLAB_maxDepth, vAR_CSLAB_maxBins)
// Evaluate model on test instances and compute test error
val vAR_CSLAB_labelAndPreds = vAR_CSLAB_testData.map { vAR_CSLAB_point =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_point.label, vAR_CSLAB_prediction)
}
val vAR_CSLAB_testErr = vAR_CSLAB_labelAndPreds.filter(r => r._1 != r._2).count.toDouble / vAR_CSLAB_testData.count()
println("Test Error = " + vAR_CSLAB_testErr)
println("Learned classification tree model:\n" + vAR_CSLAB_model.toDebugString)
// Save and load model
vAR_CSLAB_model.save(sc, "PatientMedicationAdherence")
val vAR_CSLAB_sameModel = DecisionTreeModel.load(sc, "PatientMedicationAdherence")
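The "gini" impurity selected above measures how mixed the labels at a tree node are: 1 - Σ p_c², where p_c is the frequency of class c. A small stand-alone sketch of the quantity (illustrative, not the MLlib internals):

```scala
object GiniSketch {
  // Gini impurity of a label sample: 0.0 for a pure node, growing toward
  // 1.0 as classes become more evenly mixed.
  def gini(labels: Seq[Double]): Double = {
    val n = labels.size.toDouble
    1.0 - labels.groupBy(identity).values.map(g => math.pow(g.size / n, 2)).sum
  }
}
```

A decision tree picks the split that most reduces this impurity in the child nodes.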
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance; by using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to their respective vendors, who developed and own those products.
*****************************/
/*****************************
File Name : CSLAB_PATIENT_WAIT_TIME_AT_HOSPITAL_PREDICTION_V1
Purpose : A Program for Prediction of Patient Wait Time at the Hospital in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 12/02/2019 16:15 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Prediction of Patient Wait Time at the Hospital in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.tree.RandomForest
import org.apache.spark.mllib.tree.model.RandomForestModel
import org.apache.spark.mllib.util.MLUtils
val vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
val vAR_CSLAB_DATA_FILE = "sample_libsvm_data.txt"
val vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data file.
val vAR_CSLAB_data = MLUtils.loadLibSVMFile(sc, vAR_CSLAB_FILE_PATH)
// Split the data into training and test sets (30% held out for testing)
val vAR_CSLAB_splits = vAR_CSLAB_data.randomSplit(Array(0.7, 0.3))
val (vAR_CSLAB_trainingData, vAR_CSLAB_testData) = (vAR_CSLAB_splits(0), vAR_CSLAB_splits(1))
// Train a RandomForest model.
// Empty categoricalFeaturesInfo indicates all features are continuous.
val vAR_CSLAB_numClasses = 2
val vAR_CSLAB_categoricalFeaturesInfo = Map[Int, Int]()
val vAR_CSLAB_numTrees = 3 // Use more in practice.
val vAR_CSLAB_featureSubsetStrategy = "auto" // Let the algorithm choose.
val vAR_CSLAB_impurity = "gini"
val vAR_CSLAB_maxDepth = 4
val vAR_CSLAB_maxBins = 32
val vAR_CSLAB_model = RandomForest.trainClassifier(vAR_CSLAB_trainingData, vAR_CSLAB_numClasses, vAR_CSLAB_categoricalFeaturesInfo,
vAR_CSLAB_numTrees, vAR_CSLAB_featureSubsetStrategy, vAR_CSLAB_impurity, vAR_CSLAB_maxDepth, vAR_CSLAB_maxBins)
// Evaluate model on test instances and compute test error
val vAR_CSLAB_labelAndPreds = vAR_CSLAB_testData.map { vAR_CSLAB_point =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_point.label, vAR_CSLAB_prediction)
}
val vAR_CSLAB_testErr = vAR_CSLAB_labelAndPreds.filter(r => r._1 != r._2).count.toDouble / vAR_CSLAB_testData.count()
println("Test Error = " + vAR_CSLAB_testErr)
println("Learned classification forest model:\n" + vAR_CSLAB_model.toDebugString)
// Save and load model
vAR_CSLAB_model.save(sc, "PatientHospitalWaittime")
val vAR_CSLAB_sameModel = RandomForestModel.load(sc, "PatientHospitalWaittime")
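A random-forest classifier combines its trees by majority vote over their individual predictions. The aggregation step, sketched on plain Scala values (tree outputs stubbed as a Seq; this is illustrative, not the Spark internals):

```scala
object MajorityVoteSketch {
  // Pick the class predicted by the most trees.
  def vote(treePredictions: Seq[Double]): Double =
    treePredictions.groupBy(identity).maxBy(_._2.size)._1
}
```

This is why `vAR_CSLAB_numTrees = 3` is marked "use more in practice": with more trees, individual trees' errors average out in the vote.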
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance; by using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to their respective vendors, who developed and own those products.
*****************************/
/*****************************
File Name : CSLAB_PATIENT_VOLUME_AT_HOSPITAL_PREDICTION_V1
Purpose : A Program for Prediction of Patient Volume at the Hospital in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 12/02/2019 16:39 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Prediction of Patient Volume at the Hospital in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.tree.GradientBoostedTrees
import org.apache.spark.mllib.tree.configuration.BoostingStrategy
import org.apache.spark.mllib.tree.model.GradientBoostedTreesModel
import org.apache.spark.mllib.util.MLUtils
val vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
val vAR_CSLAB_DATA_FILE = "sample_libsvm_data.txt"
val vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data file.
val vAR_CSLAB_data = MLUtils.loadLibSVMFile(sc, vAR_CSLAB_FILE_PATH)
// Split the data into training and test sets (30% held out for testing)
val vAR_CSLAB_splits = vAR_CSLAB_data.randomSplit(Array(0.7, 0.3))
val (vAR_CSLAB_trainingData, vAR_CSLAB_testData) = (vAR_CSLAB_splits(0), vAR_CSLAB_splits(1))
// Train a GradientBoostedTrees model.
// defaultParams("Regression") configures the booster with squared-error loss.
val vAR_CSLAB_boostingStrategy = BoostingStrategy.defaultParams("Regression")
vAR_CSLAB_boostingStrategy.numIterations = 3 // Note: Use more iterations in practice.
vAR_CSLAB_boostingStrategy.treeStrategy.maxDepth = 5
// Empty categoricalFeaturesInfo indicates all features are continuous.
vAR_CSLAB_boostingStrategy.treeStrategy.categoricalFeaturesInfo = Map[Int, Int]()
val vAR_CSLAB_model = GradientBoostedTrees.train(vAR_CSLAB_trainingData, vAR_CSLAB_boostingStrategy)
// Evaluate model on test instances and compute test error
val vAR_CSLAB_labelsAndPredictions = vAR_CSLAB_testData.map { vAR_CSLAB_point =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_point.label, vAR_CSLAB_prediction)
}
val vAR_CSLAB_testMSE = vAR_CSLAB_labelsAndPredictions.map{ case(v, p) => math.pow((v - p), 2)}.mean()
println("Test Mean Squared Error = " + vAR_CSLAB_testMSE)
println("Learned regression GBT model:\n" + vAR_CSLAB_model.toDebugString)
// Save and load model
vAR_CSLAB_model.save(sc, "PatientVolumeattheHospital")
val vAR_CSLAB_sameModel = GradientBoostedTreesModel.load(sc, "PatientVolumeattheHospital")
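Gradient-boosted trees form an additive model: the ensemble prediction is the sum of each stage's output, scaled by the learning rate. A toy sketch of that combination, with stages stubbed as plain functions (illustrative naming, not the MLlib implementation):

```scala
object BoostingSketch {
  // Sum of learning-rate-scaled stage outputs for input x.
  def predict(stages: Seq[Double => Double], learningRate: Double, x: Double): Double =
    stages.map(t => learningRate * t(x)).sum
}
```

Each added stage fits the residual error of the stages before it, which is why `numIterations` (the number of stages) should be larger in practice.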
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance; by using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to their respective vendors, who developed and own those products.
*****************************/
/*****************************
File Name : CSLAB_HOSPITAL_REVENUE_PREDICTION_V1
Purpose : A Program for Hospital Revenue Prediction in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 12/02/2019 17:14 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Hospital Revenue Prediction in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.mllib.regression.LinearRegressionModel
import org.apache.spark.mllib.regression.LinearRegressionWithSGD
val vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
val vAR_CSLAB_DATA_FILE = "ridge-data/lpsa.data"
val vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data
val vAR_CSLAB_data = sc.textFile(vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_parsedData = vAR_CSLAB_data.map { line =>
val vAR_CSLAB_parts = line.split(',')
LabeledPoint(vAR_CSLAB_parts(0).toDouble, Vectors.dense(vAR_CSLAB_parts(1).split(' ').map(_.toDouble)))
}.cache()
// Building the model
val vAR_CSLAB_numIterations = 100
val vAR_CSLAB_stepSize = 0.00000001
val vAR_CSLAB_model = LinearRegressionWithSGD.train(vAR_CSLAB_parsedData, vAR_CSLAB_numIterations, vAR_CSLAB_stepSize)
// Evaluate model on training examples and compute training error
val vAR_CSLAB_valuesAndPreds = vAR_CSLAB_parsedData.map { vAR_CSLAB_point =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_point.label, vAR_CSLAB_prediction)
}
val vAR_CSLAB_MSE = vAR_CSLAB_valuesAndPreds.map{ case(v, p) => math.pow((v - p), 2) }.mean()
println("training Mean Squared Error = " + vAR_CSLAB_MSE)
// Save and load model
vAR_CSLAB_model.save(sc, "HospitalRevenue")
val vAR_CSLAB_sameModel = LinearRegressionModel.load(sc, "HospitalRevenue")
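The `stepSize` above controls a per-example gradient update, w ← w − stepSize · (w·x − y) · x, for squared-error linear regression. One such SGD step, sketched without Spark (our helper, not the `LinearRegressionWithSGD` internals):

```scala
object SgdStepSketch {
  // One stochastic-gradient update for squared-error linear regression.
  def step(w: Array[Double], x: Array[Double], y: Double, stepSize: Double): Array[Double] = {
    val err = w.zip(x).map { case (wi, xi) => wi * xi }.sum - y // residual w.x - y
    w.zip(x).map { case (wi, xi) => wi - stepSize * err * xi }
  }
}
```

A very small step size like 0.00000001 keeps updates stable but may need many more than 100 iterations to converge.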
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance; by using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to their respective vendors, who developed and own those products.
*****************************/
/*****************************
File Name : CSLAB_PHYSICIAN_RECOMMENDATION_V1
Purpose : A Program for Physician Recommendation in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 12/02/2019 17:39 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Physician Recommendation in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.recommendation.ALS
import org.apache.spark.mllib.recommendation.MatrixFactorizationModel
import org.apache.spark.mllib.recommendation.Rating
val vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
val vAR_CSLAB_DATA_FILE = "als/test.data"
val vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data
val vAR_CSLAB_data = sc.textFile(vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_ratings = vAR_CSLAB_data.map(_.split(',') match { case Array(user, item, rate) =>
Rating(user.toInt, item.toInt, rate.toDouble)
})
// Build the recommendation model using ALS
val vAR_CSLAB_rank = 10
val vAR_CSLAB_numIterations = 10
val vAR_CSLAB_model = ALS.train(vAR_CSLAB_ratings, vAR_CSLAB_rank, vAR_CSLAB_numIterations, 0.01)
// Evaluate the model on rating data
val vAR_CSLAB_usersProducts = vAR_CSLAB_ratings.map { case Rating(user, product, rate) =>
(user, product)
}
val vAR_CSLAB_predictions =
vAR_CSLAB_model.predict(vAR_CSLAB_usersProducts).map { case Rating(user, product, rate) =>
((user, product), rate)
}
val vAR_CSLAB_ratesAndPreds = vAR_CSLAB_ratings.map { case Rating(user, product, rate) =>
((user, product), rate)
}.join(vAR_CSLAB_predictions)
val vAR_CSLAB_MSE = vAR_CSLAB_ratesAndPreds.map { case ((user, product), (r1, r2)) =>
val vAR_CSLAB_err = (r1 - r2)
vAR_CSLAB_err * vAR_CSLAB_err
}.mean()
println("Mean Squared Error = " + vAR_CSLAB_MSE)
// Save and load model
vAR_CSLAB_model.save(sc, "PhysicianRecommendation")
val vAR_CSLAB_sameModel = MatrixFactorizationModel.load(sc, "PhysicianRecommendation")
// If the rating matrix is derived from another source of information (e.g., it is inferred from other signals), you can use the trainImplicit method to get better results.
val vAR_CSLAB_alpha = 0.01
val vAR_CSLAB_lambda = 0.01
val vAR_CSLAB_model1 = ALS.trainImplicit(vAR_CSLAB_ratings, vAR_CSLAB_rank, vAR_CSLAB_numIterations, vAR_CSLAB_lambda, vAR_CSLAB_alpha)
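`trainImplicit` treats each observed value r as a confidence signal rather than an explicit rating, weighting it as c = 1 + alpha · r (the implicit-feedback ALS formulation of Hu, Koren, and Volinsky, which MLlib's ALS follows). The weighting itself is simple:

```scala
object ConfidenceSketch {
  // Confidence attached to an implicit observation r under weight alpha.
  def confidence(alpha: Double, r: Double): Double = 1.0 + alpha * r
}
```

Larger alpha makes the model trust frequently observed interactions more relative to unobserved ones.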
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance; by using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to their respective vendors, who developed and own those products.
*****************************/
/*****************************
File Name : CSLAB_PATIENT_WITH_SIMILAR_HEALTH_RISK_CLUSTERING_V1
Purpose : A Program for Grouping Patients with Similar Health Risk in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 12/02/2019 18:05 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Grouping Patients with Similar Health Risk in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.clustering.{PowerIterationClustering, PowerIterationClusteringModel}
import org.apache.spark.mllib.linalg.Vectors
val vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
val vAR_CSLAB_DATA_FILE = "pic_data.txt"
val vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data
val vAR_CSLAB_data = sc.textFile(vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_similarities = vAR_CSLAB_data.map { vAR_CSLAB_line =>
val vAR_CSLAB_parts = vAR_CSLAB_line.split(' ')
(vAR_CSLAB_parts(0).toLong, vAR_CSLAB_parts(1).toLong, vAR_CSLAB_parts(2).toDouble)
}
// Cluster the data into two classes using PowerIterationClustering
val vAR_CSLAB_pic = new PowerIterationClustering()
.setK(2)
.setMaxIterations(10)
val vAR_CSLAB_model = vAR_CSLAB_pic.run(vAR_CSLAB_similarities)
vAR_CSLAB_model.assignments.foreach { vAR_CSLAB_a =>
println(s"${vAR_CSLAB_a.id} -> ${vAR_CSLAB_a.cluster}")
}
// Save and load model
vAR_CSLAB_model.save(sc, "SimilarPatients")
val vAR_CSLAB_sameModel = PowerIterationClusteringModel.load(sc, "SimilarPatients")
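The assignments printed above pair each vertex id with a cluster; a common next step is to invert that into a cluster → members view. Sketched here on plain (id, cluster) pairs standing in for `model.assignments` (our helper, not Spark API):

```scala
object ClusterGroupSketch {
  // Group vertex ids by their assigned cluster.
  def byCluster(assignments: Seq[(Long, Int)]): Map[Int, Seq[Long]] =
    assignments.groupBy(_._2).map { case (c, xs) => (c, xs.map(_._1)) }
}
```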
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance; by using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to their respective vendors, who developed and own those products.
*****************************/
/*****************************
File Name : CSLAB_PATIENT_DISEASE_DAIGNOSIS_MULTICLASS_CLASSIFICATION_V1
Purpose : A Program for Patient Disease Diagnosis in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 12/02/2019 18:31 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Patient Disease Diagnosis in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.evaluation.MultilabelMetrics
import org.apache.spark.rdd.RDD;
val vAR_CSLAB_scoreAndLabels: RDD[(Array[Double], Array[Double])] = sc.parallelize(
Seq((Array(0.0, 1.0), Array(0.0, 2.0)),
(Array(0.0, 2.0), Array(0.0, 1.0)),
(Array(), Array(0.0)),
(Array(2.0), Array(2.0)),
(Array(2.0, 0.0), Array(2.0, 0.0)),
(Array(0.0, 1.0, 2.0), Array(0.0, 1.0)),
(Array(1.0), Array(1.0, 2.0))), 2)
// Instantiate metrics object
val vAR_CSLAB_metrics = new MultilabelMetrics(vAR_CSLAB_scoreAndLabels)
// Summary stats
println(s"Recall = ${vAR_CSLAB_metrics.recall}")
println(s"Precision = ${vAR_CSLAB_metrics.precision}")
println(s"F1 measure = ${vAR_CSLAB_metrics.f1Measure}")
println(s"Accuracy = ${vAR_CSLAB_metrics.accuracy}")
// Individual label stats
vAR_CSLAB_metrics.labels.foreach(label => println(s"Class $label precision = ${vAR_CSLAB_metrics.precision(label)}"))
vAR_CSLAB_metrics.labels.foreach(label => println(s"Class $label recall = ${vAR_CSLAB_metrics.recall(label)}"))
vAR_CSLAB_metrics.labels.foreach(label => println(s"Class $label F1-score = ${vAR_CSLAB_metrics.f1Measure(label)}"))
// Micro stats
println(s"Micro recall = ${vAR_CSLAB_metrics.microRecall}")
println(s"Micro precision = ${vAR_CSLAB_metrics.microPrecision}")
println(s"Micro F1 measure = ${vAR_CSLAB_metrics.microF1Measure}")
// Hamming loss
println(s"Hamming loss = ${vAR_CSLAB_metrics.hammingLoss}")
// Subset accuracy
println(s"Subset accuracy = ${vAR_CSLAB_metrics.subsetAccuracy}")
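Of the statistics above, Hamming loss is the least obvious: per example it counts the labels in the symmetric difference of the predicted and true label sets, then averages over examples and normalizes by the number of distinct labels. By hand (our helper, not the `MultilabelMetrics` implementation):

```scala
object HammingSketch {
  // Hamming loss over (predicted, true) label-set pairs.
  def hammingLoss(pairs: Seq[(Set[Double], Set[Double])], numLabels: Int): Double =
    pairs.map { case (p, t) => (p.diff(t).size + t.diff(p).size).toDouble }.sum /
      (pairs.size * numLabels)
}
```

Subset accuracy is stricter: an example only counts as correct if the predicted set matches the true set exactly.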
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance; by using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to their respective vendors, who developed and own those products.
*****************************/
/*****************************
File Name : CSLAB_PATIENT_MORTALITY_PREDICTION_V1
Purpose : A Program for Patient Mortality Prediction in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 12/02/2019 19:03 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Patient Mortality Prediction in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.tree.DecisionTree
import org.apache.spark.mllib.tree.model.DecisionTreeModel
import org.apache.spark.mllib.util.MLUtils
object PatientMortalityPrediction {
def main(args: Array[String]): Unit = {
val vAR_CSLAB_conf = new SparkConf().setAppName("PatientMortalityPrediction")
//val sc = new SparkContext(vAR_CSLAB_conf)
val vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
val vAR_CSLAB_DATA_FILE = "sample_libsvm_data.txt"
val vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data file.
val vAR_CSLAB_data = MLUtils.loadLibSVMFile(sc, vAR_CSLAB_FILE_PATH)
// Split the data into training and test sets (30% held out for testing)
val vAR_CSLAB_splits = vAR_CSLAB_data.randomSplit(Array(0.7, 0.3))
val (vAR_CSLAB_trainingData, vAR_CSLAB_testData) = (vAR_CSLAB_splits(0), vAR_CSLAB_splits(1))
// Train a DecisionTree model.
// Empty categoricalFeaturesInfo indicates all features are continuous.
val vAR_CSLAB_numClasses = 2
val vAR_CSLAB_categoricalFeaturesInfo = Map[Int, Int]()
val vAR_CSLAB_impurity = "gini"
val vAR_CSLAB_maxDepth = 5
val vAR_CSLAB_maxBins = 32
val vAR_CSLAB_model = DecisionTree.trainClassifier(vAR_CSLAB_trainingData, vAR_CSLAB_numClasses, vAR_CSLAB_categoricalFeaturesInfo,
vAR_CSLAB_impurity, vAR_CSLAB_maxDepth, vAR_CSLAB_maxBins)
// Evaluate model on test instances and compute test error
val vAR_CSLAB_labelAndPreds = vAR_CSLAB_testData.map { vAR_CSLAB_point =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_point.label, vAR_CSLAB_prediction)
}
val vAR_CSLAB_testErr = vAR_CSLAB_labelAndPreds.filter(r => r._1 != r._2).count().toDouble / vAR_CSLAB_testData.count()
println(s"Test Error = $vAR_CSLAB_testErr")
println(s"Learned classification tree model:\n ${vAR_CSLAB_model.toDebugString}")
// Save and load model
vAR_CSLAB_model.save(sc, "PatientMortalityPrediction")
val vAR_CSLAB_sameModel = DecisionTreeModel.load(sc, "PatientMortalityPrediction")
sc.stop()
}
}
PatientMortalityPrediction.main(Array(" "))
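`randomSplit` assigns each record independently to a split with probability proportional to the weights, which is why the 70/30 proportions are only approximate on small datasets. The behavior in miniature, seeded for reproducibility (our helper, not the RDD implementation):

```scala
object SplitSketch {
  // Probabilistic two-way split: each element goes to the first split
  // with probability `weight`, mirroring randomSplit's behavior.
  def split[A](xs: Seq[A], weight: Double, seed: Long): (Seq[A], Seq[A]) = {
    val rng = new scala.util.Random(seed)
    xs.partition(_ => rng.nextDouble() < weight)
  }
}
```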
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance; by using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to their respective vendors, who developed and own those products.
*****************************/
/*****************************
File Name : CSLAB_INPATIENT_CLAIMS_PREDICTION_V1
Purpose : A Program for In-Patient Claims Prediction in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 12/02/2019 19:38 hrs
Version : 1.0
*****************************/
// Program Description : A Program for In-Patient Claims Prediction in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.tree.RandomForest
import org.apache.spark.mllib.tree.model.RandomForestModel
import org.apache.spark.mllib.util.MLUtils
val vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
val vAR_CSLAB_DATA_FILE = "sample_libsvm_data.txt"
val vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data file.
val vAR_CSLAB_data = MLUtils.loadLibSVMFile(sc, vAR_CSLAB_FILE_PATH)
// Split the data into training and test sets (30% held out for testing)
val vAR_CSLAB_splits = vAR_CSLAB_data.randomSplit(Array(0.7, 0.3))
val (vAR_CSLAB_trainingData, vAR_CSLAB_testData) = (vAR_CSLAB_splits(0), vAR_CSLAB_splits(1))
// Train a RandomForest model.
// Empty categoricalFeaturesInfo indicates all features are continuous.
val vAR_CSLAB_numClasses = 2
val vAR_CSLAB_categoricalFeaturesInfo = Map[Int, Int]()
val vAR_CSLAB_numTrees = 3 // Use more in practice.
val vAR_CSLAB_featureSubsetStrategy = "auto" // Let the algorithm choose.
val vAR_CSLAB_impurity = "gini"
val vAR_CSLAB_maxDepth = 4
val vAR_CSLAB_maxBins = 32
val vAR_CSLAB_model = RandomForest.trainClassifier(vAR_CSLAB_trainingData, vAR_CSLAB_numClasses, vAR_CSLAB_categoricalFeaturesInfo,
vAR_CSLAB_numTrees, vAR_CSLAB_featureSubsetStrategy, vAR_CSLAB_impurity, vAR_CSLAB_maxDepth, vAR_CSLAB_maxBins)
// Evaluate model on test instances and compute test error
val vAR_CSLAB_labelAndPreds = vAR_CSLAB_testData.map { vAR_CSLAB_point =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_point.label, vAR_CSLAB_prediction)
}
val vAR_CSLAB_testErr = vAR_CSLAB_labelAndPreds.filter(r => r._1 != r._2).count.toDouble / vAR_CSLAB_testData.count()
println("Test Error = " + vAR_CSLAB_testErr)
println("Learned classification forest model:\n" + vAR_CSLAB_model.toDebugString)
// Save and load model
vAR_CSLAB_model.save(sc, "InpatientClaims")
val vAR_CSLAB_sameModel = RandomForestModel.load(sc, "InpatientClaims")
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance; by using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to their respective vendors, who developed and own those products.
*****************************/
/*****************************
File Name : CSLAB_HOSPITAL_READMISSION_PREDICTION_V1
Purpose : A Program for Hospital Readmission Prediction in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 12/02/2019 20:07 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Hospital Readmission Prediction in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.mllib.regression.LinearRegressionModel
import org.apache.spark.mllib.regression.LinearRegressionWithSGD
val vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
val vAR_CSLAB_DATA_FILE = "ridge-data/lpsa.data"
val vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data
val vAR_CSLAB_data = sc.textFile(vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_parsedData = vAR_CSLAB_data.map { line =>
val vAR_CSLAB_parts = line.split(',')
LabeledPoint(vAR_CSLAB_parts(0).toDouble, Vectors.dense(vAR_CSLAB_parts(1).split(' ').map(_.toDouble)))
}.cache()
// Building the model
val vAR_CSLAB_numIterations = 100
val vAR_CSLAB_stepSize = 0.00000001
val vAR_CSLAB_model = LinearRegressionWithSGD.train(vAR_CSLAB_parsedData, vAR_CSLAB_numIterations, vAR_CSLAB_stepSize)
// Evaluate model on training examples and compute training error
val vAR_CSLAB_valuesAndPreds = vAR_CSLAB_parsedData.map { vAR_CSLAB_point =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_point.label, vAR_CSLAB_prediction)
}
val vAR_CSLAB_MSE = vAR_CSLAB_valuesAndPreds.map{ case(v, p) => math.pow((v - p), 2) }.mean()
println("training Mean Squared Error = " + vAR_CSLAB_MSE)
// Save and load model
vAR_CSLAB_model.save(sc, "PatientReadmission")
val vAR_CSLAB_sameModel = LinearRegressionModel.load(sc, "PatientReadmission")
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance; by using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to their respective vendors, who developed and own those products.
*****************************/
/*****************************
File Name : CSLAB_PATIENT_QUALITY_OF_CARE_PREDICTION_V1
Purpose : A Program for Patient Quality of Care Prediction in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 12/02/2019 20:37 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Patient Quality of Care Prediction in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.tree.RandomForest
import org.apache.spark.mllib.tree.model.RandomForestModel
import org.apache.spark.mllib.util.MLUtils
object PatientQualityofCare {
def main(args: Array[String]): Unit = {
val vAR_CSLAB_conf = new SparkConf().setAppName("PatientQualityofCare")
//val sc = new SparkContext(vAR_CSLAB_conf)
val vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
val vAR_CSLAB_DATA_FILE = "sample_libsvm_data.txt"
val vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data file.
val vAR_CSLAB_data = MLUtils.loadLibSVMFile(sc, vAR_CSLAB_FILE_PATH)
// Split the data into training and test sets (30% held out for testing)
val vAR_CSLAB_splits = vAR_CSLAB_data.randomSplit(Array(0.7, 0.3))
val (vAR_CSLAB_trainingData, vAR_CSLAB_testData) = (vAR_CSLAB_splits(0), vAR_CSLAB_splits(1))
// Train a RandomForest model.
// Empty categoricalFeaturesInfo indicates all features are continuous.
val vAR_CSLAB_numClasses = 2
val vAR_CSLAB_categoricalFeaturesInfo = Map[Int, Int]()
val vAR_CSLAB_numTrees = 3 // Use more in practice.
val vAR_CSLAB_featureSubsetStrategy = "auto" // Let the algorithm choose.
val vAR_CSLAB_impurity = "variance"
val vAR_CSLAB_maxDepth = 4
val vAR_CSLAB_maxBins = 32
val vAR_CSLAB_model = RandomForest.trainRegressor(vAR_CSLAB_trainingData, vAR_CSLAB_categoricalFeaturesInfo,
vAR_CSLAB_numTrees, vAR_CSLAB_featureSubsetStrategy, vAR_CSLAB_impurity, vAR_CSLAB_maxDepth, vAR_CSLAB_maxBins)
// Evaluate model on test instances and compute test error
val vAR_CSLAB_labelsAndPredictions = vAR_CSLAB_testData.map { vAR_CSLAB_point =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_point.label, vAR_CSLAB_prediction)
}
val vAR_CSLAB_testMSE = vAR_CSLAB_labelsAndPredictions.map{ case(v, p) => math.pow((v - p), 2)}.mean()
println(s"Test Mean Squared Error = $vAR_CSLAB_testMSE")
println(s"Learned regression forest model:\n ${vAR_CSLAB_model.toDebugString}")
// Save and load model
vAR_CSLAB_model.save(sc, "PatientQualityofCare")
val vAR_CSLAB_sameModel = RandomForestModel.load(sc, "PatientQualityofCare")
//sc.stop() // left commented, as in the following programs, so the shared context stays alive
}
}
PatientQualityofCare.main(Array(" "))
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendor that developed or owns the product.
*****************************/
/*****************************
File Name : CSLAB_PATIENT_MEDICATION_RECOMMENDATION_V1
Purpose : A Program for Patient Medication Recommendation in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 12/02/2019 21:04 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Patient Medication Recommendation in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.recommendation.ALS
import org.apache.spark.mllib.recommendation.MatrixFactorizationModel
import org.apache.spark.mllib.recommendation.Rating
object PatientMedicationRecommendation {
def main(args: Array[String]): Unit = {
val vAR_CSLAB_conf = new SparkConf().setAppName("PatientMedicationRecommendation")
//val sc = new SparkContext(vAR_CSLAB_conf)
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "als/test.data";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data
val vAR_CSLAB_data = sc.textFile(vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_ratings = vAR_CSLAB_data.map(_.split(',') match { case Array(user, item, rate) =>
Rating(user.toInt, item.toInt, rate.toDouble)
})
// Build the recommendation model using ALS
val vAR_CSLAB_rank = 10
val vAR_CSLAB_numIterations = 10
val vAR_CSLAB_model = ALS.train(vAR_CSLAB_ratings, vAR_CSLAB_rank, vAR_CSLAB_numIterations, 0.01)
// Evaluate the model on rating data
val vAR_CSLAB_usersProducts = vAR_CSLAB_ratings.map { case Rating(user, product, rate) =>
(user, product)
}
val vAR_CSLAB_predictions =
vAR_CSLAB_model.predict(vAR_CSLAB_usersProducts).map { case Rating(user, product, rate) =>
((user, product), rate)
}
val vAR_CSLAB_ratesAndPreds = vAR_CSLAB_ratings.map { case Rating(user, product, rate) =>
((user, product), rate)
}.join(vAR_CSLAB_predictions)
val vAR_CSLAB_MSE = vAR_CSLAB_ratesAndPreds.map { case ((user, product), (r1, r2)) =>
val vAR_CSLAB_err = (r1 - r2)
vAR_CSLAB_err * vAR_CSLAB_err
}.mean()
println(s"Mean Squared Error = $vAR_CSLAB_MSE")
// Save and load model
vAR_CSLAB_model.save(sc, "PatientMedicationRecommendation")
val vAR_CSLAB_sameModel = MatrixFactorizationModel.load(sc, "PatientMedicationRecommendation")
//sc.stop()
}
}
PatientMedicationRecommendation.main(Array(""))
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendor that developed or owns the product.
*****************************/
/*****************************
File Name : CSLAB_HOUSE_PRICE_PREDICTION_V1
Purpose : A Program for House Price Prediction in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 12/02/2019 21:32 hrs
Version : 1.0
*****************************/
// Program Description : A Program for House Price Prediction in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.mllib.regression.LinearRegressionModel
import org.apache.spark.mllib.regression.LinearRegressionWithSGD
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "ridge-data/lpsa.data";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
val vAR_CSLAB_conf = new SparkConf().setAppName("HousePricePrediction")
//val sc = new SparkContext(vAR_CSLAB_conf)
// Load and parse the data
val vAR_CSLAB_data = sc.textFile(vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_parsedData = vAR_CSLAB_data.map { line =>
val vAR_CSLAB_parts = line.split(',')
LabeledPoint(vAR_CSLAB_parts(0).toDouble, Vectors.dense(vAR_CSLAB_parts(1).split(' ').map(_.toDouble)))
}.cache()
// Building the model
val vAR_CSLAB_numIterations = 100
val vAR_CSLAB_stepSize = 0.00000001
val vAR_CSLAB_model = LinearRegressionWithSGD.train(vAR_CSLAB_parsedData, vAR_CSLAB_numIterations, vAR_CSLAB_stepSize)
// Evaluate model on training examples and compute training error
val vAR_CSLAB_valuesAndPreds = vAR_CSLAB_parsedData.map { vAR_CSLAB_point =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_point.label, vAR_CSLAB_prediction)
}
val vAR_CSLAB_MSE = vAR_CSLAB_valuesAndPreds.map{ case(v, p) => math.pow((v - p), 2) }.mean()
println(s"Training Mean Squared Error = $vAR_CSLAB_MSE")
// Save and load model
vAR_CSLAB_model.save(sc, "HousePricePrediction")
val vAR_CSLAB_sameModel = LinearRegressionModel.load(sc, "HousePricePrediction")
//sc.stop()
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendor that developed or owns the product.
*****************************/
/*****************************
File Name : CSLAB_PUMP_FAILURE_PREDICTION_V1
Purpose : A Program for Pump Failure Prediction in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 12/02/2019 22:00 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Pump Failure Prediction in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.classification.{LogisticRegressionModel, LogisticRegressionWithLBFGS}
import org.apache.spark.mllib.evaluation.MulticlassMetrics
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.mllib.util.MLUtils
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "sample_libsvm_data.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
val vAR_CSLAB_conf = new SparkConf().setAppName("PumpFailure")
//val sc = new SparkContext(vAR_CSLAB_conf)
// Load training data in LIBSVM format.
val vAR_CSLAB_data = MLUtils.loadLibSVMFile(sc, vAR_CSLAB_FILE_PATH)
// Split data into training (60%) and test (40%).
val vAR_CSLAB_splits = vAR_CSLAB_data.randomSplit(Array(0.6, 0.4), seed = 11L)
val vAR_CSLAB_training = vAR_CSLAB_splits(0).cache()
val vAR_CSLAB_test = vAR_CSLAB_splits(1)
// Run training algorithm to build the model
val vAR_CSLAB_model = new LogisticRegressionWithLBFGS().setNumClasses(10).run(vAR_CSLAB_training)
// Compute raw scores on the test set.
val vAR_CSLAB_predictionAndLabels = vAR_CSLAB_test.map { case LabeledPoint(label, features) =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(features)
(vAR_CSLAB_prediction, label)
}
// Get evaluation metrics.
val vAR_CSLAB_metrics = new MulticlassMetrics(vAR_CSLAB_predictionAndLabels)
val vAR_CSLAB_accuracy = vAR_CSLAB_metrics.accuracy
println(s"Accuracy = $vAR_CSLAB_accuracy")
// Save and load model
vAR_CSLAB_model.save(sc, "PumpFailure")
val vAR_CSLAB_sameModel = LogisticRegressionModel.load(sc,"PumpFailure")
//sc.stop()
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendor that developed or owns the product.
*****************************/
/*****************************
File Name : CSLAB_OIL_WELL_STATUS_PREDICTION_V1
Purpose : A Program for Oil Well Status Prediction in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 12/02/2019 22:28 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Oil Well Status Prediction in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.classification.{LogisticRegressionModel, LogisticRegressionWithLBFGS}
import org.apache.spark.mllib.evaluation.MulticlassMetrics
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.mllib.util.MLUtils
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "sample_libsvm_data.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load training data in LIBSVM format.
val vAR_CSLAB_data = MLUtils.loadLibSVMFile(sc, vAR_CSLAB_FILE_PATH)
// Split data into training (60%) and test (40%).
val vAR_CSLAB_splits = vAR_CSLAB_data.randomSplit(Array(0.6, 0.4), seed = 11L)
val vAR_CSLAB_training = vAR_CSLAB_splits(0).cache()
val vAR_CSLAB_test = vAR_CSLAB_splits(1)
// Run training algorithm to build the model
val vAR_CSLAB_model = new LogisticRegressionWithLBFGS().setNumClasses(10).run(vAR_CSLAB_training)
// Compute raw scores on the test set.
val vAR_CSLAB_predictionAndLabels = vAR_CSLAB_test.map { case LabeledPoint(label, features) =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(features)
(vAR_CSLAB_prediction, label)
}
// Get evaluation metrics.
val vAR_CSLAB_metrics = new MulticlassMetrics(vAR_CSLAB_predictionAndLabels)
val vAR_CSLAB_accuracy = vAR_CSLAB_metrics.accuracy
println(s"Accuracy = $vAR_CSLAB_accuracy")
// Save and load model
vAR_CSLAB_model.save(sc, "OilWellStatus")
val vAR_CSLAB_sameModel = LogisticRegressionModel.load(sc,"OilWellStatus")
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendor that developed or owns the product.
*****************************/
/*****************************
File Name : CSLAB_OIL_WELL_REVENUE_PREDICTION_V1
Purpose : A Program for Oil Well Revenue Prediction in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 12/02/2019 22:47 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Oil Well Revenue Prediction in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.mllib.regression.LinearRegressionModel
import org.apache.spark.mllib.regression.LinearRegressionWithSGD
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "ridge-data/lpsa.data";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Load and parse the data
val vAR_CSLAB_data = sc.textFile(vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_parsedData = vAR_CSLAB_data.map { line =>
val vAR_CSLAB_parts = line.split(',')
LabeledPoint(vAR_CSLAB_parts(0).toDouble, Vectors.dense(vAR_CSLAB_parts(1).split(' ').map(_.toDouble)))
}.cache()
// Building the model
val vAR_CSLAB_numIterations = 100
val vAR_CSLAB_stepSize = 0.00000001
val vAR_CSLAB_model = LinearRegressionWithSGD.train(vAR_CSLAB_parsedData, vAR_CSLAB_numIterations, vAR_CSLAB_stepSize)
// Evaluate model on training examples and compute training error
val vAR_CSLAB_valuesAndPreds = vAR_CSLAB_parsedData.map { vAR_CSLAB_point =>
val vAR_CSLAB_prediction = vAR_CSLAB_model.predict(vAR_CSLAB_point.features)
(vAR_CSLAB_point.label, vAR_CSLAB_prediction)
}
val vAR_CSLAB_MSE = vAR_CSLAB_valuesAndPreds.map{ case(v, p) => math.pow((v - p), 2) }.mean()
println(s"Training Mean Squared Error = $vAR_CSLAB_MSE")
// Save and load model
vAR_CSLAB_model.save(sc, "OilWellRevenuePrediction")
val vAR_CSLAB_sameModel = LinearRegressionModel.load(sc, "OilWellRevenuePrediction")
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendor that developed or owns the product.
*****************************/
/*****************************
File Name : CSLAB_DATA_COLLECTION_FROM_CSV_FILE_IN_SCALA_V1
Purpose : A Program for Data Collection from a CSV File in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 9:45 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Data Collection from a CSV File in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import scala.io.Source
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH1")
var vAR_CSLAB_DATA_FILE = "Unit2_Program78_Read.csv";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
for(vAR_CSLAB_line <- Source.fromFile(vAR_CSLAB_FILE_PATH).getLines())
println(vAR_CSLAB_line)
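Source.fromFile leaves the underlying file handle open; in longer-running programs it is worth closing it explicitly. A minimal sketch using scala.util.Using (Scala 2.13+); the temporary file and all names below are illustrative, not part of the lab data:

```scala
import java.nio.file.Files
import scala.io.Source
import scala.util.Using

// create a small throwaway CSV so the example is self-contained
val vAR_CSLAB_tmpFile = Files.createTempFile("demo", ".csv")
Files.write(vAR_CSLAB_tmpFile, "name,score\nalice,90".getBytes("UTF-8"))
// Using.resource closes the Source even if the body throws
val vAR_CSLAB_lines = Using.resource(Source.fromFile(vAR_CSLAB_tmpFile.toFile)) { src =>
  src.getLines().toList
}
vAR_CSLAB_lines.foreach(println)
```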
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendor that developed or owns the product.
*****************************/
/*****************************
File Name : CSLAB_DATA_COLLECTION_FROM__EXCEL_IN_SCALA_V1
Purpose : A Program for Data Collection from an Excel File in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 09:31 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Data Collection from an Excel File in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import java.io._
object Test {
def main(args: Array[String]) {
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH1")
var vAR_CSLAB_DATA_FILE = "Unit2_Program80_Read.xlsx";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
// Note: PrintWriter writes plain text, so despite the .xlsx extension this does not produce a real Excel workbook; reading or writing genuine Excel files requires a library such as Apache POI
val vAR_CSLAB_writer = new PrintWriter(new File(vAR_CSLAB_FILE_PATH))
vAR_CSLAB_writer.write("Hello Scala")
vAR_CSLAB_writer.close()
}
}
Test.main(Array(" "))
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendor that developed or owns the product.
*****************************/
/*****************************
File Name : CSLAB_DATA_COLLECTION_FROM_FROM_URL_IN_SCALA_V1
Purpose : A Program for Data Collection from a URL in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 06/02/2019 10:49 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Data Collection from a URL in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import scala.io.Source
val vAR_CSLAB_holmesUrl = "http://www.gutenberg.org/cache/epub/1661/pg1661.txt"
for (line <- Source.fromURL(vAR_CSLAB_holmesUrl).getLines())
println(line)
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendor that developed or owns the product.
*****************************/
/*****************************
File Name : CSLAB_DATA_COLLECTION_FROM_TEXT_FILE_IN_SCALA_V1
Purpose : A Program for Data Collection from a Text File in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 09:58 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Data Collection from a Text File in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import scala.io.Source
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH1")
var vAR_CSLAB_DATA_FILE = "Data.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
for(vAR_CSLAB_line <- Source.fromFile(vAR_CSLAB_FILE_PATH).getLines())
println(vAR_CSLAB_line)
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendor that developed or owns the product.
*****************************/
/*****************************
File Name : CSLAB_DATA_COLLECTION_FROM_XML_FILE_IN_SCALA_V1
Purpose : A Program for Data Collection from an XML File in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 10:17 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Data Collection from an XML File in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import scala.xml.XML
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH1")
var vAR_CSLAB_DATA_FILE = "Unit2_Program81_Read_XML.xml";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
val vAR_CSLAB_xml = XML.loadFile(vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_temp = (vAR_CSLAB_xml \\ "channel" \\ "item" \ "condition" \ "@temp").text // .text avoids the deprecated postfix operator syntax
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendor that developed or owns the product.
*****************************/
/*****************************
File Name : CSLAB_DATA_INTERGRATION_CONCATENATE_TWO_STRINGS_V1
Purpose : A Program for Concatenating Two Strings in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 10:31 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Concatenating Two Strings in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object ConcatenateStrings {
def main(args: Array[String]) {
var vAR_CSLAB_str1 = "Hello!!! ";
var vAR_CSLAB_str2 = "I am Scala";
println(vAR_CSLAB_str1.concat(vAR_CSLAB_str2));
}
}
ConcatenateStrings.main(Array(""))
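concat is one of several equivalent ways to join strings in Scala; the + operator and string interpolation are usually considered more idiomatic. A quick comparison (values repeated here so the sketch stands alone):

```scala
val vAR_CSLAB_str1 = "Hello!!! "
val vAR_CSLAB_str2 = "I am Scala"
// operator form, equivalent to str1.concat(str2)
val vAR_CSLAB_viaPlus = vAR_CSLAB_str1 + vAR_CSLAB_str2
// interpolated form, often the most readable
val vAR_CSLAB_viaInterp = s"$vAR_CSLAB_str1$vAR_CSLAB_str2"
println(vAR_CSLAB_viaPlus)
```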
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendor that developed or owns the product.
*****************************/
/*****************************
File Name : CSLAB_DATA_INTERGRATION_INTERSECT_TWO_ARRAYS_V1
Purpose : A Program for Intersecting Two Arrays in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 10:44 hrs
Version : 1.0
*****************************/
// Program Description : A Program for Intersecting Two Arrays in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
val vAR_CSLAB_a = Array(1,2,3,4,5)
val vAR_CSLAB_b = Array(4,5,6,7,8)
val vAR_CSLAB_c = vAR_CSLAB_a.intersect(vAR_CSLAB_b)
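intersect keeps the elements of the first array that also occur in the second, preserving the first array's order; its counterpart diff keeps the elements that do not. A small self-contained check on the same data:

```scala
val vAR_CSLAB_a = Array(1, 2, 3, 4, 5)
val vAR_CSLAB_b = Array(4, 5, 6, 7, 8)
// elements of a that also appear in b, in a's order
val vAR_CSLAB_common = vAR_CSLAB_a.intersect(vAR_CSLAB_b)
// elements of a that do not appear in b
val vAR_CSLAB_onlyInA = vAR_CSLAB_a.diff(vAR_CSLAB_b)
```

Arrays are compared element-wise with sameElements rather than ==, which on arrays is reference equality.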
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendor that developed or owns the product.
*****************************/
/*****************************
File Name : CSLAB_DATA_INTERGRATION_UNION_TWO_RDD_V1
Purpose : A Program for the Union of Two RDDs in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 10:57 hrs
Version : 1.0
*****************************/
// Program Description : A Program for the Union of Two RDDs in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
val vAR_CSLAB_rdd1 = sc.parallelize(Seq((1, "Aug", 30),(1, "Sep", 31),(2, "Aug", 15),(2, "Sep", 10)))
val vAR_CSLAB_rdd2 = sc.parallelize(Seq((1, "Oct", 10),(1, "Nov", 12),(2, "Oct", 5),(2, "Nov", 15)))
vAR_CSLAB_rdd1.union(vAR_CSLAB_rdd2).collect
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendor that developed or owns the product.
*****************************/
/*****************************
File Name : CSLAB_DATA_INTERGRATION_JOIN_TWO_RDD_V1
Purpose : A Program for the Join of Two RDDs in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 11:08 hrs
Version : 1.0
*****************************/
// Program Description : A Program for the Join of Two RDDs in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
val vAR_CSLAB_rdd1 = sc.parallelize(Seq(("math", 55),("math", 56),("english", 57),("science", 54)))
val vAR_CSLAB_rdd2 = sc.parallelize(Seq(("math", 60),("math", 65),("science", 61),("history", 64)))
val vAR_CSLAB_Join = vAR_CSLAB_rdd1.join(vAR_CSLAB_rdd2)
vAR_CSLAB_Join.collect()
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendor that developed or owns the product.
*****************************/
/*****************************
File Name : CSLAB_DATA_INTERGRATION_LEFT_OUTER_JOIN_TWO_RDD_V1
Purpose : A Program for the Left Outer Join of Two RDDs in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 11:19 hrs
Version : 1.0
*****************************/
// Program Description : A Program for the Left Outer Join of Two RDDs in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
val vAR_CSLAB_rdd1 = sc.parallelize(Seq(("math", 55),("math", 56),("english", 57),("science", 54)))
val vAR_CSLAB_rdd2 = sc.parallelize(Seq(("math", 60),("math", 65),("science", 61),("history", 64)))
val vAR_CSLAB_Left_Outer_Join = vAR_CSLAB_rdd1.leftOuterJoin(vAR_CSLAB_rdd2)
vAR_CSLAB_Left_Outer_Join.collect()
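leftOuterJoin keeps every key from the left RDD; keys with no match on the right are paired with None, and each left/right match produces one output pair. The same semantics can be sketched with plain Scala collections, no Spark required (the names below are illustrative):

```scala
val vAR_CSLAB_left  = Seq(("math", 55), ("math", 56), ("english", 57), ("science", 54))
val vAR_CSLAB_right = Seq(("math", 60), ("math", 65), ("science", 61), ("history", 64))
// index the right-hand side by key
val vAR_CSLAB_rightByKey = vAR_CSLAB_right.groupBy(_._1)
// every left pair survives; unmatched keys get None
val vAR_CSLAB_leftOuter = vAR_CSLAB_left.flatMap { case (k, v) =>
  vAR_CSLAB_rightByKey.get(k) match {
    case Some(matches) => matches.map { case (_, r) => (k, (v, Some(r))) }
    case None          => Seq((k, (v, None)))
  }
}
```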
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendor that developed or owns the product.
*****************************/
/*****************************
File Name : CSLAB_DATA_INTERGRATION_RIGHT_OUTER_JOIN_TWO_RDD_V1
Purpose : A Program for the Right Outer Join of Two RDDs in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 11:31 hrs
Version : 1.0
*****************************/
// Program Description : A Program for the Right Outer Join of Two RDDs in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
val vAR_CSLAB_rdd1 = sc.parallelize(Seq(("math", 55),("math", 56),("english", 57),("science", 54)))
val vAR_CSLAB_rdd2 = sc.parallelize(Seq(("math", 60),("math", 65),("science", 61),("history", 64)))
val vAR_CSLAB_Right_Outer_Join = vAR_CSLAB_rdd1.rightOuterJoin(vAR_CSLAB_rdd2)
vAR_CSLAB_Right_Outer_Join.collect()
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendor that developed or owns the product.
*****************************/
/*****************************
File Name : CSLAB_DATA_MAPPING_GROUPBYKEY_V1
Purpose : A Program for the Groupbykey Function in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 11:44 hrs
Version : 1.0
*****************************/
// Program Description : A Program for the Groupbykey Function in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
val vAR_CSLAB_numbers = List(1,5,1,6,5,2,1,9,2,1)
val vAR_CSLAB_group = vAR_CSLAB_numbers.groupBy(vAR_CSLAB_x => vAR_CSLAB_x)
val vAR_CSLAB_group2 = vAR_CSLAB_numbers.groupBy(vAR_CSLAB_x => vAR_CSLAB_x+1)
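For the list above, groupBy(x => x) buckets equal values together, so each bucket's size is that value's frequency. A quick check on the same data, with the frequency map derived from the groups:

```scala
val vAR_CSLAB_numbers = List(1, 5, 1, 6, 5, 2, 1, 9, 2, 1)
// each distinct value maps to the list of its occurrences
val vAR_CSLAB_group = vAR_CSLAB_numbers.groupBy(x => x)
// frequencies fall out as the bucket sizes
val vAR_CSLAB_freq = vAR_CSLAB_group.map { case (k, vs) => k -> vs.size }
```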
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendor that developed or owns the product.
*****************************/
/*****************************
File Name : CSLAB_DATA_MAPPING_REDUCEBYKEY_V1
Purpose : A Program for the Reducebykey Function in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 11:59 hrs
Version : 1.0
*****************************/
// Program Description : A Program for the Reducebykey Function in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
var vAR_CSLAB_ENV_PATH = sys.env("SCALA_TUTORIAL_PATH2")
var vAR_CSLAB_DATA_FILE = "sample_svm_data.txt";
var vAR_CSLAB_FILE_PATH = vAR_CSLAB_ENV_PATH + vAR_CSLAB_DATA_FILE
val vAR_CSLAB_lines = sc.textFile(vAR_CSLAB_FILE_PATH)
val vAR_CSLAB_pairs = vAR_CSLAB_lines.map(s => (s, 1))
val vAR_CSLAB_counts = vAR_CSLAB_pairs.reduceByKey((a, b) => a + b)
vAR_CSLAB_counts.collect()
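Note that the pairs above use the entire line as the key, so reduceByKey counts duplicate lines rather than words. The reduce-by-key pattern itself can be sketched without Spark using plain collections (the data below is illustrative):

```scala
val vAR_CSLAB_pairs = Seq(("spark", 1), ("scala", 1), ("spark", 1))
// group by key, then reduce each group's values with +
val vAR_CSLAB_wordCounts = vAR_CSLAB_pairs
  .groupBy(_._1)
  .map { case (k, kvs) => k -> kvs.map(_._2).reduce(_ + _) }
```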
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendor that developed or owns the product.
*****************************/
/*****************************
File Name : CSLAB_DATA_MAPPING_MAPVALUES_V1
Purpose : A Program for the Mapvalues Function in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 12:13 hrs
Version : 1.0
*****************************/
// Program Description : A Program for the Mapvalues Function in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
val vAR_CSLAB_m = Map( "a" -> 2, "b" -> 3 )
vAR_CSLAB_m.mapValues(_ * 5)
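On Scala 2.13, Map.mapValues is deprecated in favor of the view-based form, which returns a lazy MapView; calling .toMap materializes a strict result. A sketch of the 2.13 idiom (the version-specific behavior is the assumption here):

```scala
val vAR_CSLAB_m = Map("a" -> 2, "b" -> 3)
// lazy view: the function is applied on each access
val vAR_CSLAB_view = vAR_CSLAB_m.view.mapValues(_ * 5)
// materialize into a strict Map
val vAR_CSLAB_strict = vAR_CSLAB_view.toMap
```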
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendor that developed or owns the product.
*****************************/
/*****************************
File Name : CSLAB_DATA_MAPPING_KEYS_V1
Purpose : A Program for the Keys Function in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 12:27 hrs
Version : 1.0
*****************************/
// Program Description : A Program for the Keys Function in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
val vAR_CSLAB_data = scala.collection.mutable.Map[String, String]("A" -> "1", "Bb" -> "aaa")
val vAR_CSLAB_newData = vAR_CSLAB_data.map { case (key, value) => key.toLowerCase -> value }
// iterate over an immutable snapshot (toList) so the underlying map can be mutated safely
vAR_CSLAB_data.toList.foreach { case (key, value) =>
vAR_CSLAB_data -= key
vAR_CSLAB_data += key.toLowerCase -> value
}
vAR_CSLAB_data
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendor that developed or owns the product.
*****************************/
/*****************************
File Name : CSLAB_DATA_MAPPING_COUNTBYKEY_V1
Purpose : A Program for the Countbykey Function in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 12:41 hrs
Version : 1.0
*****************************/
// Program Description : A Program for the Countbykey Function in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
val vAR_CSLAB_rdd = sc.parallelize(Seq(("math",55),("math",56),("english",57),("english",58),("science",59),("science",54)))
vAR_CSLAB_rdd.countByKey()
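countByKey returns a map from each key to its number of occurrences, ignoring the paired values. The equivalent on plain Scala collections, using the same data, makes the semantics easy to verify:

```scala
val vAR_CSLAB_scores = Seq(("math", 55), ("math", 56), ("english", 57), ("english", 58), ("science", 59), ("science", 54))
// key -> number of pairs carrying that key (the values themselves are ignored)
val vAR_CSLAB_countByKey = vAR_CSLAB_scores.groupBy(_._1).map { case (k, kvs) => k -> kvs.size }
```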
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendor that developed or owns the product.
*****************************/
/*****************************
File Name : CSLAB_ABSTRACT_CLASS_V1
Purpose : A Program for an Abstract Class in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 12:53 hrs
Version : 1.0 b
*****************************/
// Program Description : A Program for an Abstract Class in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
abstract class Bike{
def run()
}
class Hero extends Bike{
def run(){
println("running fine...")
}
}
object MainObject{
def main(args: Array[String]){
var vAR_CSLAB_h = new Hero()
vAR_CSLAB_h.run()
}
}
MainObject.main(Array(""))
/*****************************
Disclaimer:
We are providing this code block strictly for learning and research. This is not production-ready code. We assume no liability for this code under any circumstance. By using this code, users assume full risk.
All software, hardware, and other products referenced in these materials belong to the respective vendor that developed or owns the product.
*****************************/
/*****************************
File Name : CSLAB_TRAITS_V1
Purpose : A Program for Traits in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 13:38 hrs
Version : 1.0 b
*****************************/
// Program Description : A Program for Traits in Scala
// Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
trait Printable{
def print()
}
trait Showable{
def show()
}
class A6 extends Printable with Showable{
def print(){
println("This is printable")
}
def show(){
println("This is showable");
}
}
object MainObject{
def main(args:Array[String]){
var a = new A6()
a.print()
a.show()
}
}
MainObject.main(Array(""))
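Traits are not limited to abstract members; they can also carry concrete methods that every mixing class inherits. A short hedged sketch (Greeter and Person are illustrative names, not part of the program above):

```scala
trait Greeter{
  def name: String                        // Abstract member: each class supplies it
  def greet(): String = "Hello, " + name  // Concrete method inherited by mixing classes
}
class Person extends Greeter{
  def name = "Scala"
}
println(new Person().greet()) // Hello, Scala
```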
/*****************************
File Name : CSLAB_TRAIT_MIXIN_ORDERS_V1
Purpose : A Program for Trait Mixin Orders in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 13:49 hrs
Version : 1.0 b
/*****************************
## Program Description : A Program for Trait Mixin Orders in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
trait Print{
def print()
}
abstract class PrintA4{
def printA4()
}
class A6 extends PrintA4 with Print{ // First one is abstract class second one is trait
def print(){ // Trait print
println("print sheet")
}
def printA4(){ // Abstract class printA4
println("Print A4 Sheet")
}
}
object MainObject{
def main(args:Array[String]){
var vAR_CSLAB_a = new A6()
vAR_CSLAB_a.print()
vAR_CSLAB_a.printA4()
}
}
MainObject.main(Array(""))
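When several mixed-in traits override the same method and call super, Scala resolves the calls by linearizing the mixins right to left. A sketch under illustrative names:

```scala
trait Base{ def describe: String = "base" }
trait Loud extends Base{ override def describe: String = super.describe + " loud" }
trait Polite extends Base{ override def describe: String = super.describe + " polite" }
// The right-most mixin's override runs first; each super call moves one step left
val vAR_CSLAB_d = new Base with Loud with Polite {}
println(vAR_CSLAB_d.describe) // base loud polite
```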
/*****************************
File Name : CSLAB_STRING_METHODS_EQUALS_V1
Purpose : A Program for String Equals Method in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 14:02 hrs
Version : 1.0 b
/*****************************
## Program Description : A Program for String Equals Method in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
class StringExample{
var vAR_CSLAB_s1 = "Scala string example"
var vAR_CSLAB_s2 = "Hello Scala"
var vAR_CSLAB_s3 = "Hello Scala"
def show(){
println(vAR_CSLAB_s1.equals(vAR_CSLAB_s2))
println(vAR_CSLAB_s2.equals(vAR_CSLAB_s3))
}
}
object MainObject{
def main(args:Array[String]){
var vAR_CSLAB_s = new StringExample()
vAR_CSLAB_s.show()
}
}
MainObject.main(Array(""))
/*****************************
File Name : CSLAB_STRING_METHODS_COMPARETO_V1
Purpose : A Program for String CompareTo Method in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 14:16 hrs
Version : 1.0 b
/*****************************
## Program Description : A Program for String CompareTo Method in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
class StringExample{
var vAR_CSLAB_s1 = "Scala string example"
var vAR_CSLAB_s2 = "Hello Scala"
var vAR_CSLAB_s3 = "Hello Scala"
def show(){
println(vAR_CSLAB_s1.compareTo(vAR_CSLAB_s2))
println(vAR_CSLAB_s2.compareTo(vAR_CSLAB_s3))
}
}
object MainObject{
def main(args:Array[String]){
var vAR_CSLAB_s = new StringExample()
vAR_CSLAB_s.show()
}
}
MainObject.main(Array(""))
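Unlike equals, compareTo returns an Int: 0 for equal strings, a negative value when the receiver sorts before the argument, and a positive value otherwise. A quick sketch:

```scala
// The sign of compareTo encodes the lexicographic ordering
println("Hello Scala".compareTo("Hello Scala"))               // 0: equal strings
println("apple".compareTo("banana") < 0)                      // true: "apple" sorts first
println("Scala string example".compareTo("Hello Scala") > 0)  // true: 'S' > 'H'
```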
/*****************************
File Name : CSLAB_STRING_METHODS_CONCAT_V1
Purpose : A Program for String Concat Method in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 14:25 hrs
Version : 1.0 b
/*****************************
## Program Description : A Program for String Concat Method in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
class StringExample{
var vAR_CSLAB_s1 = "This is "
var vAR_CSLAB_s2 = "Scala string example"
def show(){
println(vAR_CSLAB_s1.concat(vAR_CSLAB_s2))
}
}
object MainObject{
def main(args:Array[String]){
var vAR_CSLAB_s = new StringExample()
vAR_CSLAB_s.show()
}
}
MainObject.main(Array(""))
/*****************************
File Name : CSLAB_STRING_METHODS_SUBSTRING_V1
Purpose : A Program for String Substring Method in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 14:37 hrs
Version : 1.0 b
/*****************************
## Program Description : A Program for String Substring Method in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
class StringExample3{
var vAR_CSLAB_s1 = "Scala string example"
def show(){
println(vAR_CSLAB_s1.substring(0,5))
}
}
object MainObject{
def main(args:Array[String]){
var vAR_CSLAB_s = new StringExample3()
vAR_CSLAB_s.show()
}
}
MainObject.main(Array(""))
/*****************************
File Name : CSLAB_PRIMARY_CONSTRUCTORS_V1
Purpose : A Program for Primary Constructors in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 14:49 hrs
Version : 1.0 b
/*****************************
## Program Description : A Program for Primary Constructors in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
class Student(id:Int, name:String){
def showDetails(){
println(id+" "+name);
}
}
object MainObject{
def main(args:Array[String]){
var vAR_CSLAB_s = new Student(101,"Rama");
vAR_CSLAB_s.showDetails()
}
}
MainObject.main(Array(""))
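Primary-constructor parameters can also take default values, which often removes the need for auxiliary constructors. A minimal sketch (Student2 is an illustrative name):

```scala
class Student2(val id:Int = 0, val name:String = "Unknown"){
  def showDetails(){ println(id+" "+name) }
}
new Student2(101, "Rama").showDetails() // 101 Rama
new Student2().showDetails()            // 0 Unknown: defaults apply
```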
/*****************************
File Name : CSLAB_SECONDARY_CONSTRUCTORS_V1
Purpose : A Program for Secondary Constructors in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 15:03 hrs
Version : 1.0 b
/*****************************
## Program Description : A Program for Secondary Constructors in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
class Student(id:Int, name:String){
var vAR_CSLAB_age:Int = 0
def showDetails(){
println(id+" "+name+" "+vAR_CSLAB_age)
}
def this(id:Int, name:String, age:Int){
this(id,name) // Calling the primary constructor must be the first statement
this.vAR_CSLAB_age = age
}
}
object MainObject{
def main(args:Array[String]){
var vAR_CSLAB_s = new Student(101,"Rama",20);
vAR_CSLAB_s.showDetails()
}
}
MainObject.main(Array(""))
/*****************************
File Name : CSLAB_CONSTRUCTOR_OVERLOADING_V1
Purpose : A Program for Constructor Overloading in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 15:09 hrs
Version : 1.0 b
/*****************************
## Program Description : A Program for Constructor Overloading in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
class Student(id:Int){
def this(vAR_CSLAB_id:Int, name:String)={
this(vAR_CSLAB_id)
println(vAR_CSLAB_id+" "+name)
}
println(id)
}
object MainObject{
def main(args:Array[String]){
new Student(101)
new Student(100,"India")
}
}
MainObject.main(Array(""))
/*****************************
File Name : CSLAB_SINGLETON_OBJECT_V1
Purpose : A Program for Singleton Object in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 15:22 hrs
Version : 1.0 b
/*****************************
## Program Description : A Program for Singleton Object in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object Singleton{
def main(args:Array[String]){
SingletonObject.hello() // No need to create object.
}
}
object SingletonObject{
def hello(){
println("Hello, This is Singleton Object")
}
}
Singleton.main(Array(""))
/*****************************
File Name : CSLAB_SINGLETON_COMPANION_OBJECT_V1
Purpose : A Program for Singleton Companion Object in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 15:37 hrs
Version : 1.0 b
/*****************************
## Program Description : A Program for Singleton Companion Object in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
class CompanionClass{
def hello(){
println("Hello, this is Companion Class.")
}
}
object CompanionClass{ // A companion object shares its class's name and source file
def main(args:Array[String]){
new CompanionClass().hello()
println("And this is Companion Object.")
}
}
CompanionClass.main(Array(""))
/*****************************
File Name : CSLAB_CASE_CLASS_V1
Purpose : A Program for Case Class in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 15:49 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Case Class in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
case class CaseClass(a:Int, b:Int)
object MainObject{
def main(args:Array[String]){
var vAR_CSLAB_c = CaseClass(10,10) // Creating object of case class
println("vAR_CSLAB_a = "+vAR_CSLAB_c.a) // Accessing elements of case class
println("vAR_CSLAB_b = "+vAR_CSLAB_c.b)
}
}
MainObject.main(Array(""))
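Case classes also come with structural equality, a generated toString, and a copy method for building modified instances. A short sketch (Point is an illustrative name):

```scala
case class Point(a:Int, b:Int)
val vAR_CSLAB_p1 = Point(10, 10)
val vAR_CSLAB_p2 = vAR_CSLAB_p1.copy(b = 20) // Copy, changing only field b
println(vAR_CSLAB_p1 == Point(10, 10)) // true: compared by value, not reference
println(vAR_CSLAB_p2)                  // Point(10,20): generated toString
```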
/*****************************
File Name : CSLAB_CASE_CLASS_PATTERN_MATCHING_V1
Purpose : A Program for Case Class & Pattern Matching in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 16:01 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Case Class & Pattern Matching in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
trait SuperTrait
case class CaseClass1(a:Int,b:Int) extends SuperTrait
case class CaseClass2(a:Int) extends SuperTrait // Case class
case object CaseObject extends SuperTrait // Case object
object MainObject{
def main(args:Array[String]){
callCase(CaseClass1(10,10))
callCase(CaseClass2(10))
callCase(CaseObject)
}
def callCase(f:SuperTrait) = f match{
case CaseClass1(f,g)=>println("a = "+f+" b = "+g)
case CaseClass2(f)=>println("a = "+f)
case CaseObject=>println("No Argument")
}
}
MainObject.main(Array(""))
/*****************************
File Name : CSLAB_METHOD_OVERLOADING_V1
Purpose : A Program for Method Overloading in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 16:14 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Method Overloading in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
class Arithmetic{
def add(vAR_CSLAB_a:Int, vAR_CSLAB_b:Int){
var vAR_CSLAB_sum = vAR_CSLAB_a+vAR_CSLAB_b
println(vAR_CSLAB_sum)
}
def add(vAR_CSLAB_a:Int, vAR_CSLAB_b:Int, vAR_CSLAB_c:Int){
var vAR_CSLAB_sum = vAR_CSLAB_a+vAR_CSLAB_b+vAR_CSLAB_c
println(vAR_CSLAB_sum)
}
}
object MainObject{
def main(args:Array[String]){
var vAR_CSLAB_a = new Arithmetic();
vAR_CSLAB_a.add(10,10);
vAR_CSLAB_a.add(10,10,10);
}
}
MainObject.main(Array(""))
/*****************************
File Name : CSLAB_THIS_KEYWORD_V1
Purpose : A Program for This Keyword in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 16:27 hrs
Version : 1.0
/*****************************
## Program Description : A Program for This Keyword in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
class ThisExample{
var vAR_CSLAB_id:Int = 0
var vAR_CSLAB_name: String = ""
def this(vAR_CSLAB_id:Int, vAR_CSLAB_name:String){
this()
this.vAR_CSLAB_id = vAR_CSLAB_id
this.vAR_CSLAB_name = vAR_CSLAB_name
}
def show(){
println(vAR_CSLAB_id+" "+vAR_CSLAB_name)
}
}
object MainObject{
def main(args:Array[String]){
var vAR_CSLAB_t = new ThisExample(101,"Martin")
vAR_CSLAB_t.show()
}
}
MainObject.main(Array(""))
/*****************************
File Name : CSLAB_CALLING_CONSTRUCTOR_USING_THIS_KEYWORD_V1
Purpose : A Program for Calling Constructor Using This Keyword in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 16:42 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Calling Constructor Using This Keyword in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
class Student(vAR_CSLAB_name:String){
def this(vAR_CSLAB_name:String, vAR_CSLAB_age:Int){
this(vAR_CSLAB_name)
println(vAR_CSLAB_name+" "+vAR_CSLAB_age)
}
}
object MainObject{
def main(args:Array[String]){
var vAR_CSLAB_s = new Student("Rama",100)
}
}
MainObject.main(Array(""))
/*****************************
File Name : CSLAB_THROW_KEYWORD_V1
Purpose : A Program for Throw Keyword in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 16:54 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Throw Keyword in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
class ExceptionExample2{
def validate(vAR_CSLAB_age:Int)={
if(vAR_CSLAB_age<18)
throw new ArithmeticException("You are not eligible")
else println("You are eligible")
}
}
object MainObject{
def main(args:Array[String]){
var vAR_CSLAB_e = new ExceptionExample2()
vAR_CSLAB_e.validate(10)
}
}
MainObject.main(Array(""))
/*****************************
File Name : CSLAB_THROWS_KEYWORD_V1
Purpose : A Program for Throws Keyword in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 17:06 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Throws Keyword in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
class ExceptionExample4{
@throws(classOf[NumberFormatException])
def validate()={
"abc".toInt
}
}
object MainObject{
def main(args:Array[String]){
var vAR_CSLAB_e = new ExceptionExample4()
try{
vAR_CSLAB_e.validate()
}catch{
case ex : NumberFormatException => println("Exception handled here")
}
println("Rest of the code executing...")
}
}
MainObject.main(Array(""))
/*****************************
File Name : CSLAB_CUSTOM_EXCEPTION_V1
Purpose : A Program for Custom Exception in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 17:11 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Custom Exception in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
class InvalidAgeException(s:String) extends Exception(s){}
class ExceptionExample{
@throws(classOf[InvalidAgeException])
def validate(vAR_CSLAB_age:Int){
if(vAR_CSLAB_age<18){
throw new InvalidAgeException("Not eligible")
}else{
println("You are eligible")
}
}
}
object MainObject{
def main(args:Array[String]){
var vAR_CSLAB_e = new ExceptionExample()
try{
vAR_CSLAB_e.validate(5)
}catch{
case vAR_CSLAB_e : Exception => println("Exception Occurred : "+vAR_CSLAB_e)
}
}
}
MainObject.main(Array(""))
/*****************************
File Name : CSLAB_FINALLY_BLOCK_V1
Purpose : A Program for Finally Block in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 17:26 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Finally Block in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
class ExceptionExample{
def divide(vAR_CSLAB_a:Int, vAR_CSLAB_b:Int) = {
try{
vAR_CSLAB_a/vAR_CSLAB_b // Throws ArithmeticException when vAR_CSLAB_b is 0
var vAR_CSLAB_arr = Array(1,2)
vAR_CSLAB_arr(10) // Throws ArrayIndexOutOfBoundsException: the array has only 2 elements
}catch{
case vAR_CSLAB_e: ArithmeticException => println(vAR_CSLAB_e)
case vAR_CSLAB_ex: Exception =>println(vAR_CSLAB_ex)
case vAR_CSLAB_th: Throwable=>println("found an unknown exception "+vAR_CSLAB_th)
}
finally{
println("Finally block always executes")
}
println("Rest of the code is executing...")
}
}
object MainObject{
def main(args:Array[String]){
var vAR_CSLAB_e = new ExceptionExample()
vAR_CSLAB_e.divide(100,10)
}
}
MainObject.main(Array(""))
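Since try/catch is an expression in Scala, the standard library also offers scala.util.Try, which captures success or failure as a value instead of a control jump. A hedged sketch of the same division example:

```scala
import scala.util.Try
// Try wraps the result (Success) or the thrown exception (Failure) as a value
val vAR_CSLAB_ok  = Try(100 / 10)
val vAR_CSLAB_bad = Try(100 / 0)
println(vAR_CSLAB_ok)                // Success(10)
println(vAR_CSLAB_bad.isFailure)     // true: the division by zero was captured
println(vAR_CSLAB_bad.getOrElse(-1)) // -1: fallback for the failed computation
```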
/*****************************
File Name : CSLAB_COLLECTIONS_HASHSET_V1
Purpose : A Program for Hashset in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 17:42 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Hashset in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import scala.collection.immutable.HashSet
object MainObject{
def main(args:Array[String]){
var vAR_CSLAB_hashset = HashSet(4,2,8,0,6,3,45)
vAR_CSLAB_hashset.foreach((element:Int) => println(element+" "))
}
}
MainObject.main(Array(""))
/*****************************
File Name : CSLAB_COLLECTIONS_BITSET_V1
Purpose : A Program for Bitset in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 17:55 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Bitset in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import scala.collection.immutable._
object MainObject{
def main(args:Array[String]){
var vAR_CSLAB_numbers = BitSet(1,5,8,6,9,0)
vAR_CSLAB_numbers.foreach((element:Int) => println(element))
}
}
MainObject.main(Array(""))
/*****************************
File Name : CSLAB_COLLECTIONS_LISTSET_V1
Purpose : A Program for Listset in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 18:09 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Listset in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import scala.collection.immutable._
object MainObject{
def main(args:Array[String]){
var vAR_CSLAB_listset = ListSet(4,2,8,0,6,3,45)
vAR_CSLAB_listset.foreach((element:Int) => println(element+" "))
}
}
MainObject.main(Array(""))
/*****************************
File Name : CSLAB_COLLECTIONS_SEQUENCES_V1
Purpose : A Program for Sequences in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 18:23 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Sequences in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import scala.collection.immutable._
object MainObject{
def main(args:Array[String]){
var vAR_CSLAB_seq:Seq[Int] = Seq(52,85,1,8,3,2,7)
vAR_CSLAB_seq.foreach((element:Int) => print(element+" "))
println("\nAccessing element by using index")
println(vAR_CSLAB_seq(2))
}
}
MainObject.main(Array(""))
/*****************************
File Name : CSLAB_COLLECTIONS_VECTORS_V1
Purpose : A Program for Vectors in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 18:39 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Vectors in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import scala.collection.immutable._
object MainObject{
def main(args:Array[String]){
var vAR_CSLAB_vector:Vector[Int] = Vector(5,8,3,6,9,4) // Explicit element type
var vAR_CSLAB_vector2 = Vector(5,2,6,3) // Element type inferred
var vAR_CSLAB_vector3 = Vector.empty
println(vAR_CSLAB_vector)
println(vAR_CSLAB_vector2)
println(vAR_CSLAB_vector3)
}
}
MainObject.main(Array(""))
/*****************************
File Name : CSLAB_COLLECTIONS_QUEUES_V1
Purpose : A Program for Queues in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 18:53 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Queues in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import scala.collection.immutable._
object MainObject{
def main(args:Array[String]){
var vAR_CSLAB_queue = Queue(1,5,6,2,3,9,5,2,5)
var vAR_CSLAB_queue2:Queue[Int] = Queue(1,5,6,2,3,9,5,2,5)
println(vAR_CSLAB_queue)
println(vAR_CSLAB_queue2)
}
}
MainObject.main(Array(""))
/*****************************
File Name : CSLAB_COLLECTIONS_STREAMS_V1
Purpose : A Program for Streams in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 19:07 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Streams in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
object MainObject{
def main(args:Array[String]){
val vAR_CSLAB_stream = 100 #:: 200 #:: 85 #:: Stream.empty
println(vAR_CSLAB_stream)
}
}
MainObject.main(Array(""))
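The tail of a Stream is evaluated lazily, which is what makes unbounded streams possible; only the forced prefix is ever computed. A short sketch (Stream is the pre-2.13 name; newer Scala versions call this LazyList):

```scala
// An infinite stream of naturals; take forces only the first five elements
val vAR_CSLAB_naturals = Stream.from(1)
println(vAR_CSLAB_naturals.take(5).toList) // List(1, 2, 3, 4, 5)
```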
/*****************************
File Name : CSLAB_COLLECTIONS_HASHMAP_V1
Purpose : A Program for Hashmaps in Scala
Author : DeepSphere.AI, Inc.
Date and Time : 13/02/2019 19:21 hrs
Version : 1.0
/*****************************
## Program Description : A Program for Hashmaps in Scala
## Scala Development Environment & Runtime - Eclipse IDE, Anaconda, Jupyter
import scala.collection.immutable._
object MainObject{
def main(args:Array[String]){
var vAR_CSLAB_hashMap = HashMap[String,String]() // An empty immutable map, via the companion object
var vAR_CSLAB_hashMap2 = HashMap("A"->"Apple","B"->"Ball","C"->"Cat")
println(vAR_CSLAB_hashMap)
println(vAR_CSLAB_hashMap2)
}
}
MainObject.main(Array(""))
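For lookups, apply throws on a missing key, while get returns an Option and getOrElse takes a fallback. A minimal sketch over the same sample data:

```scala
import scala.collection.immutable.HashMap
val vAR_CSLAB_m = HashMap("A"->"Apple","B"->"Ball","C"->"Cat")
println(vAR_CSLAB_m("A"))                  // Apple
println(vAR_CSLAB_m.get("Z"))              // None: missing key, no exception
println(vAR_CSLAB_m.getOrElse("Z", "N/A")) // N/A
```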