from pyspark import SparkConf, SparkContext

conf = SparkConf().setAppName("lg").setMaster("local")  # 'local' runs Spark in a single local thread; use 'local[4]' to use 4 cores
sc = SparkContext(conf=conf)
1. parallelize and collect
The parallelize function converts a list obj ...
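The round-trip the post goes on to describe can be pictured without a cluster: parallelize slices a local list into partitions, and collect concatenates them back into one list. A plain-Python analogue (not Spark itself; the function names merely mirror the RDD API for illustration):

```python
def parallelize(data, num_slices=4):
    # Mimic RDD partitioning: split the list into num_slices contiguous chunks.
    n = len(data)
    return [data[i * n // num_slices:(i + 1) * n // num_slices]
            for i in range(num_slices)]

def collect(partitions):
    # Mimic RDD.collect(): gather every partition back into a single list.
    return [item for part in partitions for item in part]

parts = parallelize([1, 2, 3, 4, 5], num_slices=2)
result = collect(parts)  # the original list, in order
```

In real PySpark the equivalent calls are `sc.parallelize(data, numSlices)` and `rdd.collect()`; the sketch only illustrates the partition-then-gather shape.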
Posted by moomsdad on Fri, 21 Feb 2020 02:13:19 -0800
In the previous post, we demonstrated how to implement ActiveMQ clustering with shared files and a shared database. See also: Master-Slave cluster deployment of the ActiveMQ high availability solution (HA) (I)
In this section, we demonstrate how to implement clustering with LevelDB + ZooKeeper.
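As a preview of what the post sets up: the replicated LevelDB store is declared in activemq.xml under persistenceAdapter, pointing each broker at the same ZooKeeper ensemble. A minimal sketch; the directory, replica count, ZooKeeper addresses, zkPath, and hostname below are placeholders, not values from the post:

```xml
<!-- activemq.xml: replicated LevelDB store coordinated through ZooKeeper -->
<persistenceAdapter>
  <replicatedLevelDB
      directory="${activemq.data}/leveldb"
      replicas="3"
      bind="tcp://0.0.0.0:0"
      zkAddress="zk1:2181,zk2:2181,zk3:2181"
      zkPath="/activemq/leveldb-stores"
      hostname="broker1"/>
</persistenceAdapter>
```

Each broker in the cluster uses the same zkAddress and zkPath but its own hostname; ZooKeeper elects one broker as master and the others replicate its store.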
Posted by gjdunga on Fri, 14 Feb 2020 05:31:00 -0800
Defining a variable in the interpreter
val and var variables
Defining variables with type inference
Java variable definition:
int a = 0;
In Scala, you can define variables with val (immutable) or var (mutable). The syntax is as follows: ...
Posted by thor erik on Sun, 09 Feb 2020 02:40:44 -0800
1, Maven compiler plugin
1. It sets the JDK version Maven uses when compiling and packaging. Maven is a Java framework, so this setting applies only to the JDK; Scala compilation must be configured separately.
2.1. Configuring the plugin
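The JDK setting the entry refers to is usually pinned in pom.xml like this; the plugin version and Java level below are illustrative assumptions, not values from the post:

```xml
<!-- pom.xml: pin the Java source/target level used by maven-compiler-plugin -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>3.8.1</version>
  <configuration>
    <source>1.8</source>
    <target>1.8</target>
  </configuration>
</plugin>
```

Scala sources are compiled by a separate plugin (such as scala-maven-plugin), which is presumably what "Scala needs to be set separately" refers to.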
Posted by Rohan Shenoy on Sun, 09 Feb 2020 00:46:26 -0800
This topic is similar to some of the search problems on LeetCode.
The problem is: count the frequency of each pair of adjacent words. If the words are w1, w2, w3, w4, w5, w6, then the adjacent pairs are (w1,w2), (w2,w3), (w3,w4), (w4,w5), (w5,w6).
The final output is (word, neighbor, frequency).
We implement it in five ways:
Spark SQL method
Spark SQL for Scala
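Whatever the engine, the core transformation is the same: emit a (word, neighbor) pair for each adjacent position and count the pairs. A minimal plain-Python sketch of that logic (the function name is illustrative; the post's five implementations use Spark):

```python
from collections import Counter

def neighbor_counts(words):
    # Pair each word with its right-hand neighbor, then count the pairs.
    # zip(words, words[1:]) yields (w1,w2), (w2,w3), ... for the input list.
    return Counter(zip(words, words[1:]))

pairs = neighbor_counts(["w1", "w2", "w3", "w2", "w3"])
# Each entry maps (word, neighbor) -> frequency, i.e. the (word, neighbor, frequency) output.
```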
Posted by olechka on Sun, 02 Feb 2020 08:18:59 -0800
1, Reading data sources
(1) To read JSON, use spark.read. Note: paths are resolved against HDFS by default; to read a local file, prefix the path with file://, as follows:
scala> val people = spark.read.format("json").load("file:///opt/software/data/people.json")
people: org.apache.spark.sql.DataFrame = [age: bigint, name: string]
Posted by Pie on Sun, 02 Feb 2020 08:18:33 -0800
2: Why Flink
3: What industries need
4: Features of Flink
5: Differences from Spark Streaming
6: Preliminary development
7: Flink configuration description
9: Running components
Flink is a framework and distributed com ...
Posted by stodge on Fri, 17 Jan 2020 01:18:24 -0800
Guide to functional programming in Scala
Scala functional programming (2): introduction to Scala basic syntax
Scala functional programming (3): Scala collections and functions
Scala functional programming (4): functional data structures
1. List code analysis
Today's content mainly supplements the Scala functional data st ...
Posted by stev979 on Thu, 19 Dec 2019 03:45:09 -0800
I'm a novice learner; if there are mistakes, please correct me, thank you!
1. Start ZooKeeper and Kafka, and create a topic named test fkss. For ease of observation, I created it through kafka-manager.
2. Configure Flume and start it; it monitors the file /home/czh/docker-public-file/testplume.log and sends its contents to Kafka.
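The Flume-to-Kafka wiring described above is normally expressed in a flume.conf properties file. A minimal sketch, assuming an exec source that tails the monitored file; the agent name a1, the hyphenated topic spelling, and the broker address are placeholders, not taken from the post:

```properties
# flume.conf: tail a log file and forward each line to a Kafka topic
a1.sources = r1
a1.channels = c1
a1.sinks = k1

# exec source: tail the monitored file
a1.sources.r1.type = exec
a1.sources.r1.command = tail -F /home/czh/docker-public-file/testplume.log
a1.sources.r1.channels = c1

# in-memory channel buffers events between source and sink
a1.channels.c1.type = memory

# Kafka sink: topic name and broker list are placeholders
a1.sinks.k1.type = org.apache.flume.sink.kafka.KafkaSink
a1.sinks.k1.kafka.topic = test-fkss
a1.sinks.k1.kafka.bootstrap.servers = localhost:9092
a1.sinks.k1.channel = c1
```

Start the agent with `flume-ng agent --conf conf --conf-file flume.conf --name a1`, then append lines to the log file and watch them arrive on the topic.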
Posted by ksduded on Sat, 14 Dec 2019 10:51:38 -0800
Now let's look at the typical usage of templates.
Now declare a views/main.scala.html template as the main template:
@(title: String)(content: Html)
<section class="content">@content</section> ...
Posted by Tryweryn on Sun, 08 Dec 2019 07:13:13 -0800