Akka (34): Http: Unmarshalling, from Json
Unmarshalling is the process by which Akka HTTP converts data from a format transportable over the Internet into high-level, structured program data, for example turning JSON into an instance of a custom type. In the concrete flow, the JSON first arrives wrapped in transferable entity data such as a Message Entity, HttpRequest, HttpRepons ...
Posted by bb_sarkar on Sun, 06 Jan 2019 19:03:09 -0800
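To make the idea concrete, here is a minimal plain-Scala sketch of unmarshalling a JSON string into a typed value. The `User` type and the hand-rolled extractor are hypothetical stand-ins; a real Akka HTTP unmarshaller delegates to a JSON library such as spray-json rather than a regex.

```scala
// Hypothetical sketch: turn a JSON wire string into a typed instance,
// analogous in spirit to Akka HTTP's Unmarshal(entity).to[User].
case class User(name: String, age: Int)

// Toy extractor for the fixed shape {"name":"...","age":...} --
// a real unmarshaller would use a proper JSON parser.
val userJson = """\{"name":"([^"]*)","age":(\d+)\}""".r

def unmarshalUser(json: String): Option[User] = json match {
  case userJson(n, a) => Some(User(n, a.toInt))
  case _              => None
}

println(unmarshalUser("""{"name":"Tiger","age":3}"""))  // Some(User(Tiger,3))
```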
Akka (23): Stream: Custom defined stream processing stages
Overall, akka-stream is built from three kinds of framework components: the data source Source, the flow node Flow, and the stream terminal Sink. Source and Sink are the two independent endpoints of a stream, while a Flow may be composed of multiple channel nodes between the Source and the Sink. Each node represents the transformation processing function of ...
Posted by pbsonawane on Sun, 06 Jan 2019 14:18:09 -0800
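As a rough analogy in plain Scala (no Akka dependency), a linear Source/Flow/Sink graph behaves like an iterator feeding a per-element transform and a terminal aggregation; the names below are illustrative only.

```scala
// Plain-Scala analogy of a linear akka-stream graph:
//   Source ~> Flow ~> Sink
val source: Iterator[Int]        = Iterator(1, 2, 3)  // Source: emits elements
val flow:   Int => Int           = _ * 2              // Flow: per-element transform
val sink:   Iterator[Int] => Int = _.sum              // Sink: terminal aggregation

val materialized = sink(source.map(flow))
println(materialized)  // 12
```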
FunDA (12) - Demonstration: strongly typed data sources
The main purpose of FunDA's design is to address the lack of row-by-row, cursor-style operations over data sources in batch-oriented FRM (Functional-Relational Mapping) libraries such as Slick. The result set produced by an FRM is a static collection, which lacks a dynamic update mode of operation. FunDA's solution is to transform the static set genera ...
Posted by luddeb on Wed, 02 Jan 2019 16:18:08 -0800
FunDA (1) - Query Result Row: Strongly typed Query result row
One of FunDA's characteristics is that it provides row-by-row data operation support in the form of data streams. This addresses the fact that data operations in FRMs such as Slick are mainly SQL batch oriented. To achieve safe and efficient operation on individual data rows, the Query result set generated by the FRM must be transformed into a stron ...
Posted by solus on Thu, 27 Dec 2018 19:27:07 -0800
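The idea can be sketched in plain Scala: an FRM result row (modelled here as a bare positional tuple) is lifted into a case class so that fields are accessed by name and checked by type. The row type and field names below are hypothetical, not FunDA's actual API.

```scala
// Hypothetical strongly typed row: what an FRM like Slick returns as a
// positional tuple becomes a named, typed case class.
case class AQMRow(mid: Int, value: Double)

val untypedRows: Seq[(Int, Double)] = Seq((1, 3.5), (2, 7.0))  // FRM result set
val typedRows: Seq[AQMRow] = untypedRows.map { case (m, v) => AQMRow(m, v) }

println(typedRows.head.value)  // 3.5 -- accessed by name, checked at compile time
```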
Linux Installs Spark Cluster (CentOS 7 + Spark 2.1.1 + Hadoop 2.8.0)
1 Install Scala (Required by Spark)
1.1 Download and Decompress Scala
1.2 Configuring environment variables
1.3 Verify Scala
2 Download and Decompress Spark
2.1 Download the Spark Compressed Package
2.2 Decompress Spark
3 Spark-related configuration
3.1 Configuring environment variable ...
Posted by micklerlop on Sat, 22 Dec 2018 02:21:06 -0800
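For reference, the environment-variable steps (1.2 and 3.1) typically amount to lines like the following in `~/.bashrc` or `/etc/profile`; the install paths here are hypothetical and must match where the archives were actually unpacked.

```shell
# Hypothetical paths -- adjust to the actual unpack locations
export SCALA_HOME=/opt/scala-2.12.2
export SPARK_HOME=/opt/spark-2.1.1-bin-hadoop2.7
export PATH=$PATH:$SCALA_HOME/bin:$SPARK_HOME/bin
```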
Akka (19): Stream: Composite Data Stream, Composite Common-Graph modular composition
In akka-stream, a Graph is a blueprint for computation. It may represent a simple linear data-flow graph such as Source/Flow/Sink, or a composite flow graph with relatively complex junctions assembled from more basic flow graphs, and a composite flow graph can itself be used as a component in a still larger Graph. Because a Graph is only a descript ...
Posted by brentech on Fri, 21 Dec 2018 16:27:05 -0800
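A plain-Scala analogy for the composition idea: smaller pieces compose into a larger unit that is itself a component of the same kind, with function composition standing in for akka-stream's graph composition.

```scala
// Function composition as an analogy for composing basic graphs into a
// composite graph that is itself reusable as a component.
val double: Int => Int = _ * 2
val inc:    Int => Int = _ + 1

val composite: Int => Int = double andThen inc        // a composed "graph"
val larger:    Int => Int = composite andThen double  // reused inside a bigger one

println(larger(5))  // (5*2 + 1) * 2 = 22
```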
FunDA (17) - Demonstration: Exception handling and finalizers
For a tool library to run safely, support for exception handling and final clean-up is indispensable to guarantee the safety of the resources it occupies. A FunDA data stream, the FDAPipeLine, generally starts by reading database data to form a data source. To ensure that every data source can be used safely, FunDA provides a post-pr ...
Posted by drakkon on Thu, 20 Dec 2018 13:24:04 -0800
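In plain Scala terms, the guarantee resembles try/finally around a stage: the outcome is captured as success or failure, and the clean-up runs either way. This shows only the underlying pattern, not FunDA's actual API.

```scala
import scala.util.Try

// Pattern behind exception handling + finalizers: capture the outcome,
// always run the clean-up.
var cleanedUp = false
val outcome: Try[Int] =
  try Try(10 / 0)               // the failing stage is captured, not thrown
  finally { cleanedUp = true }  // finalizer runs on success and failure alike

println(outcome.isFailure)  // true
println(cleanedUp)          // true
```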
PICE (2): JDBC Streaming - gRPC-JDBC Service
In an akka-cluster environment, JDBC databases are, from the point of view of data access, isolated from the other nodes in the cluster, because a JDBC database is not distributed and offers no node-location transparency. The JDBC database server must therefore provide data manipulation through a service. In this scenario, t ...
Posted by cr55dan on Tue, 18 Dec 2018 02:21:05 -0800
PICE (1): Programming In Clustered Environment
First of all, let me state that the "programming mode" of the title is a process-control programming pattern for cross-node (JVM) cluster environments that I have worked out personally. It is conceived purely from practical needs and has no theoretical backing. In May I shared some ideas about programming patterns in cluster environments on sc ...
Posted by tryin_to_learn on Sat, 15 Dec 2018 18:06:04 -0800
Install SDKMAN (The Software Development Kit Manager)
SDKMAN is a tool for managing parallel versions of multiple software development kits on Unix-based systems. It provides a simple command line interface (CLI) and API for installing, switching, removing, and listing installable candidates. Its predecessor is GVM, the Groovy enVironment Manager, inspired by the RVM and rbenv tools that are mainly used i ...
Posted by jkohns on Sat, 15 Dec 2018 17:21:03 -0800
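For illustration, the documented install one-liner and a few typical candidate-management commands look like this (Scala used as the example candidate; these commands require a network connection and a new shell after install):

```shell
# Install SDKMAN itself (official one-liner), then open a new shell
curl -s "https://get.sdkman.io" | bash

# Typical candidate management, e.g. for Scala
sdk list scala               # show installable versions
sdk install scala            # install the latest stable version
sdk current scala            # show the version currently in use
sdk uninstall scala 2.12.8   # remove a specific version
```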