Cause: org.apache.ibatis.executor.ExecutorException: Executor was closed

Background: only MyBatis is used, without Spring, because the application involves multiple data sources; each query therefore uses the mapper factory method to create a mapper and execute a select or update. "Executor was closed" was thrown when the same mapper executed a select and then executed an update. Solution log: 2018-05-30 17:09:58,941 DEBUG [org.apache ...
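The failure mode described above — reusing a mapper whose underlying SqlSession has already been closed — can be sketched with a minimal stand-in class. This is illustrative only: FakeSession is not a real MyBatis type, just a model of the session lifecycle; the fix it demonstrates is to keep one session open for the whole unit of work (or open a fresh one per operation) instead of issuing a statement after close.

```java
// Illustrative stand-in for a SqlSession-backed mapper; not real MyBatis classes.
class FakeSession implements AutoCloseable {
    private boolean closed = false;

    String selectOne() { check(); return "row"; }
    int update()       { check(); return 1; }

    @Override public void close() { closed = true; }

    private void check() {
        // Mirrors MyBatis: any statement on a closed session's executor fails.
        if (closed) throw new IllegalStateException("Executor was closed");
    }
}

public class SessionDemo {
    public static void main(String[] args) {
        // Anti-pattern: the session is closed after the select, then reused.
        FakeSession bad = new FakeSession();
        bad.selectOne();
        bad.close();
        try {
            bad.update();                       // throws "Executor was closed"
        } catch (IllegalStateException expected) {
            System.out.println(expected.getMessage());
        }

        // Fix: one session spans the whole select-then-update unit of work.
        try (FakeSession good = new FakeSession()) {
            good.selectOne();
            good.update();                      // fine: session still open
        }
    }
}
```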

Posted by JayFM on Wed, 06 Feb 2019 08:54:16 -0800

Linux+Solr+Zookeeper-01: Build a Solr Service (Non-cluster) and Register It as a System Service

[Hyperlink: Linux+Solr+Zookeeper Series - Chapter 1] This article mainly explains how to build a Solr service on a Linux server and register it as a system service. The Solr service built in this article is non-clustered (non-SolrCloud mode) and is intended only for single-instance use. 1. Environment Ubuntu 16.04 L ...

Posted by lostboy on Tue, 05 Feb 2019 12:27:16 -0800

Getting Started with Spring Cloud: Eureka Consumer Service Consumption (Declarative Feign) (III)

Spring Cloud Feign is a declarative service invocation client based on Netflix Feign implementation. It makes it easier to write Web service clients. We only need to create an interface and configure it with annotations to bind to the Web service interface. It has pluggable annotation support, including Feign annotations and JAX-RS annotations. ...

Posted by cableuser on Mon, 04 Feb 2019 23:39:17 -0800

Mac + nginx + iOS: Push over RTMP, Playback with VLC and HTML5

This is the stream being pushed from the mobile phone; on the left it is played in Apple's built-in browser, and on the right it is played using VLC. The storage path of the HLS segments is also shown. An overview of the whole implementation, combining theory with practice: implementing simple real-time video live broadcasting based on HTML5. The greatest contribution of this article is ...
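A minimal server-side configuration for the pipeline described above (RTMP ingest from the phone, HLS segments written to disk, HTTP playback for the HTML5/Safari side) might look like this, assuming nginx is built with the third-party nginx-rtmp-module; the port numbers and /tmp/hls path are illustrative choices, not values from the article:

```nginx
rtmp {
    server {
        listen 1935;                 # RTMP ingest from the phone
        application live {
            live on;
            hls on;                  # also cut the stream into HLS segments
            hls_path /tmp/hls;       # storage path of the .m3u8/.ts files
            hls_fragment 3s;
        }
    }
}

http {
    server {
        listen 8080;
        location /hls {              # HTML5 <video>/Safari pull the playlist here
            types {
                application/vnd.apple.mpegurl m3u8;
                video/mp2t ts;
            }
            root /tmp;
            add_header Cache-Control no-cache;
        }
    }
}
```

VLC can play the same stream directly from rtmp://host/live/streamkey, while browsers fetch http://host:8080/hls/streamkey.m3u8.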

Posted by brandon99999 on Mon, 04 Feb 2019 15:33:16 -0800

Running Principle of Spark Streaming

Sequence diagram 1. NetworkWordCount 2. Initialize StreamingContext 3. Create InputDStream 4. Start Job Scheduler 1. NetworkWordCount package yk.streaming import org.apache.spark.SparkConf import org.apache.spark.storage.StorageLevel import org.apache.spark.streaming.{Seconds, StreamingContext} object NetworkWordCount { ...

Posted by bsarika on Sun, 03 Feb 2019 15:03:16 -0800

Solution to "unable to find valid certification path to requested target"

Today in after-sales support, a customer who had bought a certificate consulted us about a 400 error on an HTTPS request and asked me to look into it. The customer's code seemed fine, so I tried it myself, and there really is an exception. Recording it here to avoid the pitfall later. Scenario: two-way SSL validation; requests nee ...

Posted by joel426 on Sun, 03 Feb 2019 13:51:15 -0800

Tomcat 8 cookie domain validation problem

The error report is as follows:
Type: Exception Report
Message: An invalid domain [127.0.0.1:8080] was specified for this cookie
Description: The server encountered an unexpected condition that prevented it from fulfilling the request.
Exception: java.lang.IllegalArgumentException: An invalid domain [127.0.0.1:8080] was specified ...
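Tomcat 8's cookie validation rejects any cookie domain that contains a ':', so a Host header value like "127.0.0.1:8080" must have its port stripped before being passed to Cookie#setDomain (or the domain should simply be omitted). A small helper makes this explicit — the name sanitizeCookieDomain is ours for illustration, not a Tomcat API:

```java
public class CookieDomainDemo {
    // Strip ":port" from a Host header value before using it as a cookie domain.
    // Tomcat 8 validates the domain and rejects "127.0.0.1:8080" outright.
    static String sanitizeCookieDomain(String host) {
        int colon = host.indexOf(':');
        return colon >= 0 ? host.substring(0, colon) : host;
    }

    public static void main(String[] args) {
        System.out.println(sanitizeCookieDomain("127.0.0.1:8080")); // 127.0.0.1
        System.out.println(sanitizeCookieDomain("example.com"));    // example.com
    }
}
```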

Posted by Rollo Tamasi on Sun, 03 Feb 2019 01:51:15 -0800

Spark textfile reads HDFS file partitions [compressed and uncompressed]

sc.textFile("/blabla/{*.gz}") When we create a SparkContext and use textFile to read files, what is the partitioning based on, and what is the partition size? It depends on the compression format of the files and on the file size versus the HDFS block size. textFile creates a HadoopRDD that uses ...
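The partitioning question above comes down to the old-API Hadoop FileInputFormat split math that HadoopRDD reuses: splittable files are cut at roughly min(goalSize, blockSize) boundaries, while a non-splittable .gz file always becomes a single partition. A self-contained sketch of that arithmetic, assuming default settings (the SPLIT_SLOP constant mirrors FileInputFormat; the helper names are ours):

```java
public class SplitDemo {
    // A last chunk up to 10% larger than splitSize is folded into the previous split.
    static final double SPLIT_SLOP = 1.1;

    // goalSize = totalSize / minPartitions; blockSize is the HDFS block size.
    static long computeSplitSize(long goalSize, long minSize, long blockSize) {
        return Math.max(minSize, Math.min(goalSize, blockSize));
    }

    static int numSplits(long fileSize, long splitSize, boolean splittable) {
        if (!splittable) return 1;          // e.g. a .gz file: one partition
        int splits = 0;
        long remaining = fileSize;
        while ((double) remaining / splitSize > SPLIT_SLOP) {
            splits++;
            remaining -= splitSize;
        }
        if (remaining != 0) splits++;
        return splits;
    }

    public static void main(String[] args) {
        long mb = 1024L * 1024;
        // 300 MB plain-text file, minPartitions=2, 128 MB blocks:
        long split = computeSplitSize(150 * mb, 1, 128 * mb); // -> 128 MB splits
        System.out.println(numSplits(300 * mb, split, true));  // 3 partitions
        // The same 300 MB file gzipped is not splittable:
        System.out.println(numSplits(300 * mb, split, false)); // 1 partition
    }
}
```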

Posted by MK27 on Sat, 02 Feb 2019 17:54:15 -0800

Installation of the Snipe-IT Asset Management System in a CentOS 7 + Apache + PHP 7.2 + MariaDB Environment

I. Environment preparation CentOS 7 + Apache 2.4.6 + PHP 7.2 + MariaDB 5.5.60. Apache and MariaDB are installed directly with yum, and PHP is compiled from source. Pre-installation preparation 1. System update # Note: a minimized installation of CentOS 7.5 is used here yum -y install epel-release yum update -y 2. Inst ...

Posted by iceblox on Sat, 02 Feb 2019 16:27:15 -0800

I/O Operation of Hadoop: Serialization (I)

1. Serialization (1) Basic definitions: serialization refers to the conversion of structured objects into byte streams for transmission over the network or permanent storage on disk; deserialization refers to the conversion of byte streams back into structured objects. (2) Applications ...
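The contract Hadoop expresses through its Writable interface — write(DataOutput) turns the object's fields into a byte stream, readFields(DataInput) restores them — can be sketched with only the JDK. The IntPairWritable class below is illustrative, not a Hadoop type; it just shows the symmetric write/read pattern the definitions above describe:

```java
import java.io.*;

// Plain-JDK sketch of Hadoop's Writable contract (illustrative class, not Hadoop's).
class IntPairWritable {
    int first, second;

    void write(DataOutput out) throws IOException {    // serialize: object -> bytes
        out.writeInt(first);
        out.writeInt(second);
    }

    void readFields(DataInput in) throws IOException { // deserialize: bytes -> object
        first = in.readInt();
        second = in.readInt();
    }
}

public class WritableDemo {
    // Serialize (x, y) to a byte array and read it back;
    // returns {byteCount, first, second} after the round trip.
    static int[] roundTrip(int x, int y) {
        try {
            IntPairWritable a = new IntPairWritable();
            a.first = x;
            a.second = y;
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            a.write(new DataOutputStream(bos));

            IntPairWritable b = new IntPairWritable();
            b.readFields(new DataInputStream(new ByteArrayInputStream(bos.toByteArray())));
            return new int[]{bos.size(), b.first, b.second};
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        int[] r = roundTrip(3, 7);
        // Two 4-byte ints -> an 8-byte stream that restores to the same fields.
        System.out.println(r[0] + " " + r[1] + " " + r[2]); // 8 3 7
    }
}
```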

Posted by cedricm on Sat, 02 Feb 2019 14:36:16 -0800