Big data Flume custom type
1. Customize the Interceptor
1.1 case requirements
When Flume is used to collect a server's local logs, logs of different types need to be sent to different analysis systems.
1.2 requirement analysis: Interceptor and Multiplexing ChannelSelector cases
In actual development, there may be many types of ...
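As a concrete illustration of the case, here is a minimal Scala sketch against Flume's public Interceptor API; the header key "type", the keyword check and the class name are illustrative assumptions, not taken from the post:

import java.util
import org.apache.flume.{Context, Event}
import org.apache.flume.interceptor.Interceptor
import scala.jdk.CollectionConverters._

// Tags every event with a "type" header so that a multiplexing
// channel selector can route it to the matching channel.
class TypeInterceptor extends Interceptor {
  override def initialize(): Unit = {}

  // Assumption for the sketch: route on a keyword in the event body.
  override def intercept(event: Event): Event = {
    val body = new String(event.getBody)
    if (body.contains("error")) event.getHeaders.put("type", "error")
    else event.getHeaders.put("type", "info")
    event
  }

  override def intercept(events: util.List[Event]): util.List[Event] = {
    events.asScala.foreach(e => intercept(e))
    events
  }

  override def close(): Unit = {}
}

// Flume instantiates interceptors through a companion Builder.
object TypeInterceptor {
  class Builder extends Interceptor.Builder {
    override def build(): Interceptor = new TypeInterceptor
    override def configure(context: Context): Unit = {}
  }
}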
Posted by nainil on Fri, 26 Nov 2021 08:12:12 -0800
scala -- format and usage of functions + usage of lazy + difference between methods and functions + methods calling methods
1. Method
1.1 overview
In actual development we need to write a large amount of logic code, which inevitably involves repeated requirements. For example, finding the maximum of 10 and 20 and then the maximum of 11 and 22 means writing the comparison logic twice. If the comparison logic is put into a method, it ...
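A minimal Scala sketch of that idea (the method name getMax is an illustrative choice):

// Write the comparison logic once, in a method, and reuse it.
def getMax(a: Int, b: Int): Int = if (a > b) a else b

println(getMax(10, 20))  // 20
println(getMax(11, 22))  // 22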
Posted by tony-kidsdirect on Thu, 25 Nov 2021 16:45:14 -0800
scala -- operators + bit operations + ones' complement, two's complement and sign-magnitude representations
1. Arithmetic operator
1.1 introduction to operators
A symbol used to join variables or constants is called an operator, and a formula connected by operators is called an expression. We use them constantly in practical development
For example:
10 + 3 is an expression, and the + sign is an operator
Note: in Scala, operators are not only operator ...
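A short sketch of both points, with illustrative values:

// In Scala an operator is really a method on the left operand,
// so the infix form and the explicit method call are equivalent.
val sum  = 10 + 3     // infix expression; + is the operator
val same = (10).+(3)  // the same call written as a method
println(sum == same)  // true

// Bit operations on small values:
println(5 & 3)  // 1  (0101 & 0011 = 0001)
println(5 | 3)  // 7  (0101 | 0011 = 0111)
println(5 ^ 3)  // 6  (0101 ^ 0011 = 0110)
println(~5)     // -6 (bitwise NOT of the two's-complement representation)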
Posted by Scummy12 on Wed, 24 Nov 2021 12:37:04 -0800
Introduction to Scala -- data types
1. Output statements and semicolons
1.1 output statement
Mode 1: output with a line feed
format: println(the data you want to print to the console);
Mode 2: output without a line feed
format: print(the data you want to print to the console);
Note: multiple values can be printed at the same time, whether with println() or print() ...
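A quick sketch of both modes (the printed values are illustrative):

println("with line feed")  // Mode 1: moves to the next line afterwards
print("no line feed, ")    // Mode 2: stays on the same line
print("still on the same line")
println()                  // finish the line
println("a", 1, true)      // several values at once are auto-tupled: (a,1,true)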
Posted by theresandy on Tue, 23 Nov 2021 15:02:59 -0800
Spark read CSV file operation, option parameters explained
import com.bean.Yyds1
import org.apache.spark.sql.SparkSession

object TestReadCSV {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("CSV Reader")
      .master("local")
      .getOrCreate()

    /**
     * Option values can be either strings or typed values, such as booleans.
     * delimiter: separat ...
     */
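A hedged usage sketch continuing from the code above; the file path and option values are illustrative, while header, delimiter and inferSchema are standard Spark CSV reader options:

    val df = spark.read
      .option("header", "true")       // treat the first line as column names
      .option("delimiter", ",")       // field separator
      .option("inferSchema", "true")  // infer column types instead of all strings
      .csv("data/people.csv")         // illustrative path
    df.show()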
Posted by George Botley on Sun, 21 Nov 2021 12:20:33 -0800
GitLab CI/CD automated build and release practice
Process introduction
CI/CD is a method of delivering applications to customers frequently by introducing automation into the application development phases. The core concepts of CI/CD are continuous integration, continuous delivery and continuous deployment. In this article, I will introduce the practice of automated build and release based on GitLa ...
Posted by SwarleyAUS on Sun, 21 Nov 2021 11:12:41 -0800
Big data development review: scala
10. scala
10.1 introduction to scala
scala is a multi-paradigm programming language that runs on the JVM. It supports both object-oriented and functional programming.
10.2 scala interpreter
To start the scala interpreter, you only need the following steps:
Press and hold the Windows key + R
Just enter scala
Execute :quit in the scal ...
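A sketch of such a session (assuming scala is already on the PATH; output abbreviated):

C:\> scala
scala> 1 + 1
res0: Int = 2
scala> :quit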
Posted by cuvaibhav on Thu, 18 Nov 2021 06:30:09 -0800
Basic and advanced scala
Methods and functions: the distinction between the two. Scala has both methods and functions, and there is little semantic difference between them; in most cases they are treated as equivalent. A Scala method is part of a class, while a function is an object that can be assigned to a variable. That is, a function defined in a class is a me ...
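A compact Scala sketch of the distinction (the names are illustrative):

object MethodVsFunction {
  def addMethod(a: Int, b: Int): Int = a + b            // a method, part of this object
  val addFunction: (Int, Int) => Int = (a, b) => a + b  // a function, a value

  def main(args: Array[String]): Unit = {
    println(addMethod(1, 2))    // 3
    println(addFunction(1, 2))  // 3
    val f = addMethod _         // eta-expansion turns the method into a function
    println(f(1, 2))            // 3
  }
}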
Posted by Gecko24 on Fri, 12 Nov 2021 08:43:45 -0800
Spark common RDD operators for big data development
map
map takes in one piece of data and returns one piece of data. map applies a function to the elements of the RDD one by one, mapping them to a new RDD: each data item in the source RDD is transformed into a new element through the function given to map. Input partition and o ...
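A minimal runnable sketch of map on an RDD (app name and data are illustrative):

import org.apache.spark.sql.SparkSession

object MapDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("MapDemo").master("local").getOrCreate()
    val rdd = spark.sparkContext.parallelize(Seq(1, 2, 3, 4))
    val doubled = rdd.map(_ * 2)               // each element through the function
    println(doubled.collect().mkString(", "))  // 2, 4, 6, 8
    spark.stop()
  }
}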
Posted by axo on Tue, 09 Nov 2021 10:56:02 -0800
Scala express syntax (11) | data structure
map mapping function
The map function returns a new collection, passing each element of the collection through the function given in the parentheses
// Create a list
val list = List(1, 2, 3, 4, 5)
// Multiply each element by 2
println(list.map(_ * 2))
flatMap mapping: flat means to flatten, so flatMap is a map that flattens its results
flat ...
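A short sketch contrasting map and flatMap (the sample data is illustrative):

val words = List("hello world", "scala spark")
println(words.map(_.split(" ").toList))  // List(List(hello, world), List(scala, spark))
println(words.flatMap(_.split(" ")))     // List(hello, world, scala, spark)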
Posted by nashyboy on Thu, 04 Nov 2021 18:23:12 -0700