Spring Cloud Getting Started: Eureka Consumer Service Consumption (Declarative Feign) (III)
Spring Cloud Feign is a declarative service invocation client built on Netflix Feign. It makes writing web service clients easier: we only need to create an interface and annotate it to bind it to a web service endpoint. It has pluggable annotation support, including Feign annotations and JAX-RS annotations. ...
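To see why a declarative client needs nothing more than an annotated interface, here is a toy sketch of the mechanism such clients build on: a JDK dynamic proxy reads an annotation from the interface method and turns the call into a request. The `@Get` annotation, `HelloClient`, and the service URL are illustrative stand-ins, not Feign's actual API:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.Proxy;

// Toy stand-in for a Feign-style request annotation (illustrative, not Feign's API)
@Retention(RetentionPolicy.RUNTIME)
@interface Get { String value(); }

// The caller writes only this interface; no HTTP code anywhere.
interface HelloClient {
    @Get("/hello-service/hello")
    String hello();
}

public class FeignSketch {
    @SuppressWarnings("unchecked")
    static <T> T bind(Class<T> type, String baseUrl) {
        return (T) Proxy.newProxyInstance(type.getClassLoader(), new Class<?>[]{type},
                (proxy, method, args) -> {
                    Get get = method.getAnnotation(Get.class);
                    // A real client would issue the HTTP request here;
                    // this sketch just returns the URL the annotation resolves to.
                    return baseUrl + get.value();
                });
    }

    public static void main(String[] args) {
        HelloClient client = bind(HelloClient.class, "http://HELLO-SERVICE");
        System.out.println(client.hello()); // http://HELLO-SERVICE/hello-service/hello
    }
}
```

The proxy is what lets "create an interface and annotate it" replace hand-written request code.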
Posted by cableuser on Mon, 04 Feb 2019 23:39:17 -0800
Mac nginx iOS push RTMP VLC play HTML5 play
This is the stream being pushed from the mobile phone; on the left is Apple's built-in browser.
On the right it is played with VLC.
Storage path of the HLS segments
Let's walk through the whole implementation.
Combining Theory with Practice: Implementing a Simple Real-time Video Live Broadcasting Based on HTML5
The greatest contribution of this article is ...
Posted by brandon99999 on Mon, 04 Feb 2019 15:33:16 -0800
Running Principle of Spark Streaming
Sequence diagram
1. NetworkWordCount
2. Initialize StreamingContext
3. Create InputDStream
4. Start Job Scheduler
1. NetworkWordCount
package yk.streaming
import org.apache.spark.SparkConf
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.{Seconds, StreamingContext}
object NetworkWordCount { ...
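The counting step inside the truncated example (split each line into words, then reduce by key) can be sketched without Spark in plain Java; single-space tokenization is an assumption of this sketch:

```java
import java.util.Arrays;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;

public class WordCount {
    // Equivalent, for one batch, of flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _)
    static Map<String, Long> count(String line) {
        return Arrays.stream(line.split(" "))
                .collect(Collectors.groupingBy(Function.identity(), Collectors.counting()));
    }

    public static void main(String[] args) {
        System.out.println(count("a b a")); // {a=2, b=1}
    }
}
```

Spark applies exactly this kind of per-batch reduction to each micro-batch the receiver collects.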
Posted by bsarika on Sun, 03 Feb 2019 15:03:16 -0800
Solution for "unable to find valid certification path to requested target"
Today a customer who had bought a certificate from us reported a 400 error on an HTTPS request and asked for help. The customer's code looked fine, so I tried it myself, and there really was an exception. Recording it here to avoid the pitfall later.
Scenario: two-way SSL validation; requests nee ...
Posted by joel426 on Sun, 03 Feb 2019 13:51:15 -0800
Tomcat 8 cookie domain validation problem
The error code is as follows
Type Exception Report
Message An invalid domain [127.0.0.1:8080] was specified for this cookie
Description The server encountered an unexpected condition that prevented it from fulfilling the request.
Exception
java.lang.IllegalArgumentException: An invalid domain [127.0.0.1:8080] was specified ...
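The root cause is that RFC 6265 defines the cookie Domain attribute as a host name only, so a value carrying a port like `127.0.0.1:8080` is rejected by Tomcat's cookie validation. A toy check of that rule (simplified; `isValidDomain` is a hypothetical helper, not Tomcat's implementation):

```java
public class CookieDomainCheck {
    // Per RFC 6265 the Domain attribute is a host name: no port, no scheme.
    // Simplified illustration of the rule Tomcat enforces.
    static boolean isValidDomain(String domain) {
        return !domain.contains(":") && domain.matches("[A-Za-z0-9.\\-]+");
    }

    public static void main(String[] args) {
        System.out.println(isValidDomain("127.0.0.1:8080")); // false: strip the port
        System.out.println(isValidDomain("127.0.0.1"));      // true
    }
}
```

The fix, accordingly, is to set the cookie domain without the port.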
Posted by Rollo Tamasi on Sun, 03 Feb 2019 01:51:15 -0800
Spark textfile reads HDFS file partitions [compressed and uncompressed]
sc.textFile("/blabla/{*.gz}")
When we create a SparkContext and read files with textFile, what determines the partitioning, and how large is each partition?
Compressed format of files
File size and HDFS block size
textFile creates a HadoopRDD that uses ...
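How the two factors above interact can be shown with a simplified estimator: a non-splittable compressed file (e.g. gzip) always becomes a single partition, while an uncompressed file is cut into roughly block-sized splits. `estimateSplits` is a hypothetical helper, and the formula omits details of the real FileInputFormat logic (such as the split slack factor and per-file remainders):

```java
public class SplitEstimator {
    // Rough sketch of FileInputFormat-style split sizing (simplified).
    static long estimateSplits(long fileSize, long blockSize, int minPartitions, boolean splittable) {
        if (!splittable) return 1;                       // e.g. a .gz file: one partition total
        long goalSize = fileSize / Math.max(1, minPartitions);
        long splitSize = Math.max(1, Math.min(goalSize, blockSize));
        return (fileSize + splitSize - 1) / splitSize;   // ceiling division
    }

    public static void main(String[] args) {
        System.out.println(estimateSplits(256L << 20, 128L << 20, 2, true));  // 2: two 128MB splits
        System.out.println(estimateSplits(256L << 20, 128L << 20, 2, false)); // 1: gzip is not splittable
    }
}
```

This is why a single large .gz file can silently serialize a whole Spark stage onto one task.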
Posted by MK27 on Sat, 02 Feb 2019 17:54:15 -0800
Installation of Snipe-IT Asset Management System in CentOS 7+Apache+PHP 7.2+Mariadb Environment
I. Environmental preparation
CentOS 7 + Apache 2.4.6 + PHP 7.2 + MariaDB 5.5.60
Apache and MariaDB are installed directly with yum, while PHP is built from source.
Pre-installation preparation
1. System update # Note: a minimal installation of CentOS 7.5 is used here
yum -y install epel-release
yum update -y
2. Inst ...
Posted by iceblox on Sat, 02 Feb 2019 16:27:15 -0800
I/O Operation of Hadoop: Serialization (I)
1. Serialization
(1) Basic definitions. Serialization is the conversion of structured objects into a byte stream, for transmission over the network or permanent storage on disk; deserialization is the conversion of a byte stream back into structured objects.
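The round trip in (1) can be illustrated with plain `java.io` streams; Hadoop itself uses its own `Writable` types rather than this exact code, so treat it as a sketch of the idea:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;

public class SerDemo {
    // Serialize two structured fields into a byte stream, then read them back;
    // this mirrors the write(DataOutput)/readFields(DataInput) pattern.
    static String roundTrip(String name, int year) {
        try {
            ByteArrayOutputStream buf = new ByteArrayOutputStream();
            DataOutputStream out = new DataOutputStream(buf);
            out.writeUTF(name);   // structured fields -> bytes
            out.writeInt(year);

            DataInputStream in = new DataInputStream(new ByteArrayInputStream(buf.toByteArray()));
            return in.readUTF() + " " + in.readInt();  // bytes -> structured fields
        } catch (IOException e) {  // cannot occur on in-memory streams
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(roundTrip("hadoop", 2019)); // hadoop 2019
    }
}
```

The byte stream in `buf` is exactly what would cross the network or land on disk.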
(2) Applications
...
Posted by cedricm on Sat, 02 Feb 2019 14:36:16 -0800
DBUtils for database operations
Summary
DBUtils is a practical utility library for database operations in Java: compact, simple, and practical.
It encapsulates JDBC operations, simplifying them so that we can write less code.
Introduction to Three Core Functions of DBUtils
QueryRunner: provides the API for executing SQL statements
ResultSetHan ...
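The division of labour between QueryRunner and a ResultSetHandler can be mimicked in a few lines of plain Java. Here `query` fakes the JDBC execution with one hard-coded row, so this is only a sketch of the encapsulation pattern, not DBUtils' actual API:

```java
import java.util.List;
import java.util.Map;

public class MiniDbUtils {
    // Toy analog of QueryRunner + ResultSetHandler: the caller supplies only the
    // result-mapping step; resource handling stays inside query().
    interface RowHandler<T> { T handle(List<Map<String, Object>> rows); }

    static <T> T query(String sql, RowHandler<T> handler) {
        // Stand-in for executing sql over JDBC; we fake a single result row.
        List<Map<String, Object>> rows = List.of(Map.of("id", 1, "name", "alice"));
        try {
            return handler.handle(rows);   // mapping is the only caller-visible step
        } finally {
            // a real implementation closes Connection/Statement/ResultSet here
        }
    }

    public static void main(String[] args) {
        String name = query("select * from users", rows -> (String) rows.get(0).get("name"));
        System.out.println(name); // alice
    }
}
```

This is the "write less code" payoff: open/close boilerplate is written once, not per query.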
Posted by Brentley_11 on Sat, 02 Feb 2019 14:12:16 -0800
Sqoop Incremental Import and Export and Job Operation Example
Incremental import
Append-mode incremental import on an incrementing column
# First import
[root@node222 ~]# /usr/local/sqoop-1.4.7/bin/sqoop import --connect jdbc:mysql://192.168.0.200:3306/sakila?useSSL=false --table actor --where "actor_id < 50" --username sakila -P --num-mappers 1 --target-dir /tmp/hive/sqoop/actor_all
...
18/10/1 ...
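Semantically, `--incremental append` means: on the next run, fetch only rows whose check column (here `actor_id`) exceeds the recorded `--last-value`. A plain-Java sketch of that filter, with made-up data:

```java
import java.util.List;
import java.util.stream.Collectors;

public class IncrementalAppend {
    // Rows imported by an append-mode incremental run: check column > last value.
    static List<Integer> newRows(List<Integer> sourceIds, int lastValue) {
        return sourceIds.stream()
                .filter(id -> id > lastValue)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        // last import stopped at actor_id 49; only newer rows are fetched now
        System.out.println(newRows(List.of(48, 49, 50, 51), 49)); // [50, 51]
    }
}
```

Sqoop records the new high-water mark after each run, which is what makes saved jobs convenient for repeated incremental imports.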
Posted by zak on Sat, 02 Feb 2019 12:21:16 -0800