C

Operational requirements for Student Achievement Management: (1) class Student { int id; string name; int score; Student next; } (2) Use a linked list or the system List class; an array implementation (up to 100 records) is also possible. (3) Insert: check that the student number is not duplicated. (4) Delete: if an integer is input, the record to be deleted is l ...
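The excerpt's Student class is a singly linked list node. A minimal sketch of requirements (1)–(4) in Python (the class and method names beyond `Student` are illustrative, not taken from the article):

```python
class Student:
    """Singly linked list node holding one student record."""
    def __init__(self, sid, name, score):
        self.sid = sid
        self.name = name
        self.score = score
        self.next = None


class StudentList:
    """Linked-list store with duplicate-checked insert and delete by id."""
    def __init__(self):
        self.head = None

    def insert(self, sid, name, score):
        """Insert a record; reject a duplicate student number."""
        node = self.head
        while node:
            if node.sid == sid:
                return False          # duplicate student number: refuse
            node = node.next
        new = Student(sid, name, score)
        new.next = self.head          # prepend at the head
        self.head = new
        return True

    def delete(self, sid):
        """Delete the record whose student number equals sid."""
        prev, node = None, self.head
        while node:
            if node.sid == sid:
                if prev is None:
                    self.head = node.next
                else:
                    prev.next = node.next
                return True
            prev, node = node, node.next
        return False
```

An array of up to 100 records would work the same way, with the duplicate check scanning the used portion of the array instead of the chain.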

Posted by gaza165 on Sat, 22 Jun 2019 16:19:48 -0700

spark examples

Spark Streaming is a quasi-real-time (micro-batch) stream processing framework: its processing response time is typically within a minute, i.e. the latency for processing real-time data is at the second level. Storm is a true real-time stream processing framework whose processing response is at the millisecond level. So the selection of a streaming framework depends on the spec ...
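The latency difference comes from the processing model: micro-batching groups events into fixed intervals, while record-at-a-time engines handle each event as it arrives. A toy sketch of the two models (function names are illustrative; this is not Spark or Storm API):

```python
def micro_batches(events, interval):
    """Group (timestamp, value) events into fixed-interval batches,
    the way a micro-batch engine like Spark Streaming processes data.
    Latency is bounded below by the batch interval."""
    batches = {}
    for ts, value in events:
        bucket = ts - (ts % interval)   # start time of the batch window
        batches.setdefault(bucket, []).append(value)
    return [batches[k] for k in sorted(batches)]


def per_event(events, handler):
    """Process each event immediately, the way a record-at-a-time
    engine like Storm does; latency is per event, not per batch."""
    return [handler(value) for _, value in events]
```

With a 5-second interval, events at t=0, 1, 5, 11 land in three batches; the per-event model would have handled all four individually as they arrived.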

Posted by Frederick on Fri, 21 Jun 2019 14:37:32 -0700

Principle and basic configuration of master-slave replication of MySQL 5.7

Principle: Traditional MySQL replication is master-slave replication, with one master and one or more slaves. After a transaction is committed and executed on the master node, it is sent (asynchronously) to the slave nodes through a log file, where it is re-executed (in statement-based replication) or applied (in row-based replication). By default, all ...
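The log-shipping idea above can be modeled in a few lines. This is a toy sketch, not MySQL internals: a master appends committed operations to a binlog, and a slave replays entries past its last applied position, asynchronously.

```python
class Master:
    """Toy master: applies writes locally and records them in a binlog."""
    def __init__(self):
        self.data = {}
        self.binlog = []            # the log file shipped to slaves

    def commit(self, key, value):
        self.data[key] = value
        # Statement-based replication would log the operation;
        # row-based replication would log the resulting row image.
        self.binlog.append(("set", key, value))


class Slave:
    """Toy slave: replays the master's binlog from its saved position."""
    def __init__(self):
        self.data = {}
        self.applied = 0            # position in the master's binlog

    def catch_up(self, master):
        """Asynchronously replay log entries not yet applied."""
        for op, key, value in master.binlog[self.applied:]:
            if op == "set":
                self.data[key] = value
        self.applied = len(master.binlog)
```

Because replication is asynchronous, the slave is stale between commits on the master and its next `catch_up` pass, which is exactly the replication lag the article's configuration has to account for.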

Posted by 5kyy8lu3 on Thu, 20 Jun 2019 19:19:27 -0700

Summary of mysqldumpslow usage

Origin: MySQL in the production environment had a slow-query problem. To analyze the slow queries, enable the slow query log and analyze it. To avoid misoperation in the production environment, copy the slow query log back to your own computer for analysis. The results are as follows: ...
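The core of what mysqldumpslow does is abstract the literals in each logged query (numbers become `N`, quoted strings become `'S'`) so that structurally identical queries group together and can be counted or sorted. A stdlib sketch of that normalization step (the function names are illustrative; the real tool also parses timing lines from the log):

```python
import re


def normalize(query):
    """Abstract literals the way mysqldumpslow does: quoted strings
    become 'S' and numbers become N, so similar queries group together."""
    query = re.sub(r"'[^']*'", "'S'", query)
    query = re.sub(r"\b\d+\b", "N", query)
    return query


def summarize(queries):
    """Count occurrences of each normalized query pattern."""
    counts = {}
    for q in queries:
        key = normalize(q)
        counts[key] = counts.get(key, 0) + 1
    return counts
```

Two queries that differ only in the bound id collapse into one pattern, which is why the tool's summary is so much shorter than the raw log.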

Posted by CanWeb on Thu, 20 Jun 2019 19:06:22 -0700

MYSQL 1) Application Scenario of Temporary Table 2) merge Storage Engine Query Application Scenario after Tabulation

For practical work, let's summarize here; it's easy to forget, so mark it: 1) Temporary table: searching the Internet and comparing temporary tables with memory tables: the table structure and data of a temporary table are stored in memory, and its life cycle begins and ends with the session. The table structure of a memory table is stored in the databa ...

Posted by GESmithPhoto on Thu, 20 Jun 2019 18:19:14 -0700

Experiments - MySql transaction isolation level

Through a Baidu search for "MySql transaction isolation level" and "InnoDB transaction isolation level", many articles were found whose "characteristics" are as follows: the key point is to explain the difference between non-repeatable reads and phantom reads; most of the conclusions are: under the isolation level of repe ...
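One property every isolation level above READ UNCOMMITTED shares is the absence of dirty reads: a second session never sees another session's uncommitted changes. A hedged stdlib illustration using sqlite3 (SQLite is not InnoDB, and its isolation semantics differ from REPEATABLE READ, but it also prevents dirty reads, which is enough to show the idea):

```python
import os
import sqlite3
import tempfile

# Two connections to the same on-disk database simulate two sessions
# (":memory:" would give each connection its own private database).
path = os.path.join(tempfile.mkdtemp(), "demo.db")
writer = sqlite3.connect(path)
reader = sqlite3.connect(path)

writer.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, v TEXT)")
writer.commit()

# Session 1 inserts but does not commit yet.
writer.execute("INSERT INTO t VALUES (1, 'uncommitted')")

# Session 2 cannot see the uncommitted row: no dirty read.
before = reader.execute("SELECT COUNT(*) FROM t").fetchone()[0]

writer.commit()

# After the commit, the row becomes visible to session 2.
after = reader.execute("SELECT COUNT(*) FROM t").fetchone()[0]
```

The non-repeatable read vs. phantom read distinction the article investigates is about what happens to rows a transaction has *already read* versus rows that newly *match its predicate*; reproducing that needs two genuinely concurrent InnoDB sessions, which is what the article's experiments set up.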

Posted by []InTeR[] on Thu, 20 Jun 2019 13:18:50 -0700

django database table building process and table structure

Catalog: Configure the database · Create the table structure · Set up multi-table relationships · Create the association table yourself · Use a self-built association table together with ManyToManyField · Configure the database: in the settings.py file of the Django project, configure the database connection inf ...
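The database step the catalog refers to is the `DATABASES` setting in settings.py. A minimal sketch for MySQL, assuming the stock `django.db.backends.mysql` backend; the database name, user, and password below are placeholders, not values from the article:

```python
# settings.py (fragment) -- credentials here are illustrative only.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.mysql",
        "NAME": "exam_db",        # placeholder database name
        "USER": "root",           # placeholder user
        "PASSWORD": "secret",     # placeholder password
        "HOST": "127.0.0.1",
        "PORT": "3306",
    }
}
```

With this in place, `ManyToManyField` relations either get an auto-generated association table or, as the catalog's later sections cover, can point at a self-built one via the field's `through` option.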

Posted by peppeto on Wed, 19 Jun 2019 11:36:13 -0700

Very good review material: How to write the SQL statement in the end?

The database used in this article is as follows: CREATE DATABASE exam; /* Create the department table */ CREATE TABLE dept( deptno INT PRIMARY KEY, dname VARCHAR(50), loc VARCHAR(50) ); /* Create the employee table */ CREATE TABLE emp( empno INT PRIMARY KEY, ename VARCHAR(50), job VARC ...
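The dept/emp pair above is the classic schema for practicing joins. A self-contained sketch using the stdlib sqlite3 module (the article uses MySQL, but the syntax of these simple statements is the same; the emp columns here are trimmed, and the `deptno` foreign key and sample rows are illustrative assumptions, not data from the article):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dept (
        deptno INTEGER PRIMARY KEY,
        dname  VARCHAR(50),
        loc    VARCHAR(50)
    );
    CREATE TABLE emp (
        empno  INTEGER PRIMARY KEY,
        ename  VARCHAR(50),
        deptno INTEGER REFERENCES dept(deptno)
    );
    INSERT INTO dept VALUES (10, 'ACCOUNTING', 'NEW YORK');
    INSERT INTO emp  VALUES (7839, 'KING', 10);
""")

# A typical exam-style query: each employee with their department name.
rows = conn.execute("""
    SELECT e.ename, d.dname
    FROM emp e JOIN dept d ON e.deptno = d.deptno
""").fetchall()
```

Most of the review questions this kind of article walks through are variations on this join plus aggregation (`GROUP BY deptno`), subqueries, and outer joins over the same two tables.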

Posted by sobha on Mon, 17 Jun 2019 15:00:25 -0700

Experimental Testing of the Admin_move_table Mechanism (1)

Reorg mainly rearranges data to reclaim unnecessary space usage; at the same time, it can reduce the number of write buffers and improve performance. Reducing space occupancy can be achieved by lowering the high water mark in the table space. In order to analyze various reorg cases, an experimental test was carried out. 1. ...

Posted by Mad_Mike on Mon, 17 Jun 2019 14:31:29 -0700

Spark SQL Learning Notes

Spark SQL is the module in Spark for processing structured data. Unlike the underlying Spark RDD API, the Spark SQL interfaces provide more information about the structure of the data and the computation being performed. Spark SQL now has three different APIs: SQL statements, the DataFrame API, and the newest Dataset API. One use of Spark SQL is to ex ...

Posted by arunmj82 on Sun, 16 Jun 2019 17:27:38 -0700