Vue + Express + MySQL Full Stack Initial Experience

Keywords: node.js Vue MySQL JSON npm

Preface

Original Address

Have you ever thought about your future as a front-end engineer? Did the words "Front End Architect" come to mind? And is a qualified front-end architect someone who only knows the front end? Of course not. You need full-stack skills to broaden your profile, get the promotion and pay raise, marry rich and beautiful, and reach the peak of your life.

Recently, while writing some back-end projects, I found there was too much duplicated work, especially in the framework part. So I extracted the front-end and back-end scaffolding into a starter: mainly Vue and Express, with MySQL for data storage. Of course, if you have other needs, you can switch directly to sqlite, postgres, or mssql.

First, here is the project source address:

project

The project uses a todolist as its example, and simply implements front-end and back-end CRUD.

Backend Technology Stack

Front End Technology Stack

Project structure

Let's look at the project structure first, with client as the front-end part and server as the back-end part:

|-- express-vue-web-slush
    |-- client
    |   |-- http.js   // axios request encapsulation
    |   |-- router.js  // vue-router
    |   |-- assets  // Static Resources
    |   |-- components  // Common Components
    |   |-- store  // store
    |   |-- styles // style
    |   |-- views // view
    |-- server
        |-- api    // controller api file
        |-- container  // ioc container
        |-- daos  // dao layer
        |-- initialize  // Project Initialization File
        |-- middleware  // middleware
        |-- models  // model Layer
        |-- services // service layer

Code introduction

There isn't much front-end code; you can see at a glance that it is the structure generated by vue-cli. The only difference is that the front-end code is written in Vue class-component style; for details, see Project preparation from react transition to vue development.

Then here's a brief description of the back-end code.

Hot Update

This is essential for the development environment. We use nodemon; add a nodemon.json to the project root directory:

{
  "ignore": [
    ".git",
    "node_modules/**/node_modules",
    "src/client"
  ]
}

The ignore field tells nodemon to skip changes under .git, nested node_modules, and the front-end code folder src/client. Any other js file change will make nodemon restart the node project.

For convenience, I wrote a script that starts the front-end and back-end projects together, as follows:

import * as childProcess from 'child_process';

function run() {
  // start the front end with vue-cli-service serve
  const client = childProcess.spawn('vue-cli-service', ['serve']);
  client.stdout.on('data', x => process.stdout.write(x));
  client.stderr.on('data', x => process.stderr.write(x));

  // start the back end with nodemon, which re-runs the babel-server script on changes
  const server = childProcess.spawn('nodemon', ['--exec', 'npm run babel-server'], {
    env: Object.assign({
      NODE_ENV: 'development'
    }, process.env)
  });
  server.stdout.on('data', x => process.stdout.write(x));
  server.stderr.on('data', x => process.stderr.write(x));

  // kill both child processes when this script exits
  process.on('exit', () => {
    server.kill('SIGTERM');
    client.kill('SIGTERM');
  });
}
run();

The front end is started with the vue-cli-service command of vue-cli.

The backend starts by executing the babel-node command with nodemon.

The front-end and back-end projects are thus started through node child processes, and we add the corresponding scripts to package.json:

{
    "scripts": {
        "dev-env": "cross-env NODE_ENV=development",
        "babel-server": "npm run dev-env && babel-node --config-file ./server.babel.config.js -- ./src/server/main.js",
        "dev": "babel-node --config-file ./server.babel.config.js -- ./src/dev.js",
    }
}

server.babel.config.js is the Babel compilation configuration for the back end.
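The repository has its own version of this file; as a rough reference only, a minimal server.babel.config.js might look like the following (the exact presets and plugins are assumptions, chosen because the API code below uses decorators and class fields):

// server.babel.config.js -- a possible minimal configuration (illustrative, not the repo's exact file)
module.exports = {
  presets: [
    ['@babel/preset-env', { targets: { node: 'current' } }]
  ],
  plugins: [
    // needed for the @route()/@GET() decorators and class fields such as modelName = 'Item'
    ['@babel/plugin-proposal-decorators', { legacy: true }],
    ['@babel/plugin-proposal-class-properties', { loose: true }]
  ]
};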

Project Configuration

So-called project configuration means system configuration that has nothing to do with your business logic, such as log monitoring configuration, database connection information, and so on.

First, create a configuration file named config.properties in the project root. Here, for example, is the MySQL section that I use:

[mysql]
host=127.0.0.1
port=3306
user=root
password=root
database=test

Before the project starts, we parse it with the properties package. Create a new properties.js under server/initialize to parse the configuration file:

import properties from 'properties';
import path from 'path';

const propertiesPath = path.resolve(process.cwd(), 'config.properties');

export default function load() {
  return new Promise((resolve, reject) => {
    properties.parse(propertiesPath, { path: true, sections: true }, (err, obj) => {
      if (err) {
        reject(err);
        return;
      }
      resolve(obj);
    });
  }).catch(e => {
    console.error(e);
    return {};
  });
}
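For reference, with the config.properties above, load() should resolve to an object shaped roughly like this (the exact value types depend on the parser options):

// approximate shape of the object resolved by load()
const config = {
  mysql: {
    host: '127.0.0.1',
    port: 3306,
    user: 'root',
    password: 'root',
    database: 'test'
  }
};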

Then, before the project starts, initialize mysql: create a new file index.js in the server/initialize folder:

import loadProperties from './properties';
import { initSequelize } from './sequelize';
import container from '../container';
import * as awilix from 'awilix';
import { installModel } from '../models';

export default async function initialize() {
  const config = await loadProperties();
  const { mysql } = config;
  const sequelize = initSequelize(mysql);
  installModel(sequelize);
  container.register({
    globalConfig: awilix.asValue(config),
    sequelize: awilix.asValue(sequelize)
  });
}

Here we use sequelize for data persistence and awilix for dependency injection; both are described below.

After initializing all the configurations, we perform initialize before the project starts, as follows:

import express from 'express';
import path from 'path';
import initialize from './initialize';
import fs from 'fs';

const app = express();

export default async function run() {
  await initialize(app);

  app.get('*', (req, res) => {
    const html = fs.readFileSync(path.resolve(__dirname, '../client', 'index.html'), 'utf-8');
    res.send(html);
  });

  app.listen(9001, err => {
    if (err) {
      console.error(err);
      return;
    }
    console.log('Listening at http://localhost:9001');
  });
}

run();

Data persistence

As a front-end developer, you may have no concept of the term data persistence, so here is a brief introduction. Data has two states: transient and persistent. Transient data generally lives in memory and is not saved permanently; once our server goes down, the data is lost. Persistent data, on the other hand, has already been written to disk, for example by mysql or mongodb; even if the server goes down, we can restart the service and still read it. So the purpose of data persistence is to store the data held in memory in mysql or another database.

For data persistence we use sequelize, which connects to mysql for us and lets us quickly perform CRUD operations on the data.

Here we create a new sequelize.js in the server/initialize folder to make it easier for us to connect when the project is initialized:

import Sequelize from 'sequelize';

let sequelize;

const defaultPreset = {
  host: 'localhost',
  dialect: 'mysql',
  operatorsAliases: false,
  port: 3306,
  pool: {
    max: 10,
    min: 0,
    acquire: 30000,
    idle: 10000
  }
};

export function initSequelize(config) {
  const { host, database, password, port, user } = config;
  sequelize = new Sequelize(database, user, password, Object.assign({}, defaultPreset, {
    host,
    port
  }));
  return sequelize;
};

export default sequelize;

The config argument of initSequelize comes from our config.properties, and the connection is made before the project starts.

Then we need to set up a Model for each table in the database. Taking the todolist as an example, create a new file ItemModel.js under server/models:

export default function(sequelize, DataTypes) {
    const Item = sequelize.define('Item', {
        recordId: {
            type: DataTypes.INTEGER,
            field: 'record_id',
            primaryKey: true
        },
        name: {
            type: DataTypes.STRING,
            field: 'name'
        },
        state: {
            type: DataTypes.INTEGER,
            field: 'state'
        }
    }, {
        tableName: 'item',
        timestamps: false
    });
    return Item;
}

Then, also under server/models, create a new index.js that imports all the models in the models folder:

import fs from 'fs';
import path from 'path';
import Sequelize from 'sequelize';

const db = {};

export function installModel(sequelize) {
  fs.readdirSync(__dirname)
    .filter(file => (file.indexOf('.') !== 0 && file.slice(-3) === '.js' && file !== 'index.js'))
    .forEach((file) => {
      const model = sequelize.import(path.join(__dirname, file));
      db[model.name] = model;
    });
  Object.keys(db).forEach((modelName) => {
    if (db[modelName].associate) {
      db[modelName].associate(db);
    }
  });
  db.sequelize = sequelize;
  db.Sequelize = Sequelize;
}

export default db;

This installModel is also executed when the project is initialized.

Once the models have been set up, we can define our Dao layer and use them.

Dependency Injection

Dependency Injection (DI) is the most common way of implementing Inversion of Control (IoC). Most of us first heard about this concept from Spring. Inversion of control means the container creates the instances we need instead of us creating them manually, and we don't have to care about the dependencies those instances are built from; the IoC container manages them for us, greatly reducing the coupling in our code.

The dependency injection library used here is awilix. First we create the container: under server/container, create a new index.js:

import * as awilix from 'awilix';

const container = awilix.createContainer({
  injectionMode: awilix.InjectionMode.PROXY
});

export default container;

Then, when our project is initialized, we use awilix-express to set up our back-end routes as follows:

import express from 'express';
import { loadControllers, scopePerRequest } from 'awilix-express';
import { Lifetime } from 'awilix';
import container from './container';

const app = express();

app.use(scopePerRequest(container));

app.use('/api', loadControllers('api/*.js', {
  cwd: __dirname,
  lifetime: Lifetime.SINGLETON
}));

Then we can create a new controller under server/api, where we create a new TodoApi.js:

import { route, GET, POST } from 'awilix-express';

@route('/todo')
export default class TodoAPI {

  constructor({ todoService }) {
    this.todoService = todoService;
  }

  @route('/getTodolist')
  @GET()
  async getTodolist(req, res) {
    const [err, todolist] = await this.todoService.getList();
    if (err) {
      res.failPrint('Server side exception');
      return;
    }
    res.successPrint('query was successful', todolist);
  }

  //  ...
}

Here you can see that the todoService instance from the service layer is injected through the constructor argument and can then be used directly. Combined with the '/api' mount above, this handler is reachable at GET /api/todo/getTodolist.

Next we wire up our Service and Dao layers. When the project is initialized, we tell the IoC container where all of our service and dao files live:

import path from 'path';
import container from './container';
import { asClass } from 'awilix';

// register the service layer and dao layer with the DI container
container.loadModules(['services/*.js', 'daos/*.js'], {
  formatName: 'camelCase',
  register: asClass,
  cwd: path.resolve(__dirname)
});

Then we can freely create new service and dao files in the services and daos folders (formatName: 'camelCase' means TodoService.js is registered as todoService and ItemDao.js as itemDao). Here we create a new TodoService.js:


export default class TodoService {
  constructor({ itemDao }) {
    this.itemDao = itemDao;
  }

  async getList() {
    try {
      const list = await this.itemDao.getList();
      return [null, list];
    } catch (e) {
      console.error(e);
      return [new Error('Server side exception'), null];
    }
  }

  // ...
}

Then create a new dao, ItemDao.js, that connects to the ItemModel, i.e. the item table in mysql:

import BaseDao from './base';

export default class ItemDao extends BaseDao {
    
    modelName = 'Item';

    constructor(modules) {
      super(modules);
    }

    async getList() {
      return await this.findAll();
    }
}

Then write a BaseDao that encapsulates some common database operations. The code is too long to paste here; see the Code Base linked below for details.
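As a rough idea only, a minimal BaseDao might look like the sketch below, assuming it looks up the sequelize model through the modelName field that each dao sets (the helper names are illustrative, not the repo's actual API):

// daos/base.js -- a simplified, hypothetical BaseDao
import db from '../models';

export default class BaseDao {
  constructor(modules) {
    this.modules = modules; // the awilix cradle, in case a subclass needs other dependencies
  }

  // resolve the sequelize model lazily, after the subclass has set modelName
  get model() {
    return db[this.modelName];
  }

  async findAll(where = {}, options = {}) {
    return this.model.findAll({ where, ...options });
  }

  async create(values, options = {}) {
    return this.model.create(values, options);
  }
}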

About Transactions

Transactions are easy to understand with an example: suppose we execute two SQL statements to insert two rows. If the first one succeeds and the second one fails, we roll back the transaction, and the first, already inserted record is undone as well.

So, to support transactions, we can use a middleware that injects a transaction into the request where needed, so that all the SQL statements executed for that request share the same transaction. For example, the following middleware:

import { asValue } from 'awilix';
import container from '../container';
export default function () {
  return function (req, res, next) {
    const sequelize = container.resolve('sequelize');
    sequelize.transaction({  // Open Transaction
      autocommit: false
    }).then(t => {
      req.container = req.container.createScope(); // Create a new IOC container scope for the current request
      req.transaction = t;
      req.container.register({  // Inject a transaction for IOC
        transaction: asValue(t)
      });
      next();
    });
  }
}
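The middleware then has to be mounted on the routes that need it. A possible registration (the file name is an assumption), which must come after scopePerRequest(container) so that req.container already exists:

import transactionMiddleware from './middleware/transaction'; // hypothetical file name

// give every /api request its own transaction scope
app.use('/api', transactionMiddleware());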

Then, wherever we need to use the transaction, we can have the IoC container inject it. For example, we use the transaction in TodoService.js:


export default class TodoService {
  constructor({ itemDao, transaction }) {
    this.itemDao = itemDao;
    this.transaction = transaction;
  }

  async addItem(item) {
    // TODO: Add item data
    const success = await this.itemDao.addItem(item);
    if (success) {
      this.transaction.commit(); // Perform transaction commit
    } else {
      this.transaction.rollback(); // rollback the transaction
    }
  }

  // ...
}
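For the commit or rollback to actually matter, the dao has to pass the transaction on to the underlying sequelize call. A hypothetical addItem in ItemDao.js, assuming the BaseDao sketch above and that the request-scoped transaction is injected into the dao as well:

import BaseDao from './base';

export default class ItemDao extends BaseDao {

    modelName = 'Item';

    constructor(modules) {
      super(modules);
      // hypothetical: the transaction registered by the middleware is resolved from the request scope
      this.transaction = modules.transaction;
    }

    async addItem(item) {
      try {
        // pass the transaction so this INSERT joins the request's transaction
        await this.model.create(item, { transaction: this.transaction });
        return true;
      } catch (e) {
        console.error(e);
        return false;
      }
    }
}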

Other

What if we need to use the current request object in the Service or Dao layers? In that case we inject the request and response of each request into the IoC container, as in the following middleware:

import { asValue } from 'awilix';

export function baseMiddleware(app) {
  return (req, res, next) => {
    res.successPrint = (message, data) => res.json({ success: true, message, data });

    res.failPrint = (message, data) => res.json({ success: false, message, data });
    req.app = app;

    // inject the request and response into the request-scoped container
    req.container = req.container.createScope();
    req.container.register({
      request: asValue(req),
      response: asValue(res)
    });
    next();
  }
}

Then, when the project is initialized, use the middleware:

import express from 'express';

const app = express();
app.use(baseMiddleware(app));
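With this in place, any service or dao resolved in the request scope can receive the current request (or response) through its constructor. A hypothetical example:

export default class TodoService {
  constructor({ itemDao, request }) {
    this.itemDao = itemDao;
    this.request = request; // the express req object registered by baseMiddleware
  }

  async getListForCurrentUser() {
    // hypothetical: read something from the current request, e.g. a query parameter
    const { userId } = this.request.query;
    console.log('loading todolist for user', userId);
    return this.itemDao.getList();
  }
}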

About deployment

We use PM2 for a simple deployment. Create a new pm2.json at the project root:

{
  "apps": [
    {
      "name": "vue-express",  // Instance name
      "script": "./dist/server/main.js",  // Startup File
      "log_date_format": "YYYY-MM-DD HH:mm Z",  // Log Date Folder Format
      "output": "./log/out.log",  // Other Logs
      "error": "./log/error.log", // error log
      "instances": "max",  // Number of Node instances started
      "watch": false, // Close File Listening Restart
      "merge_logs": true,
      "env": {
        "NODE_ENV": "production"
      }
    }
  ]
}

At this point, we need to compile the client and server into the dist directory, then point the server's static resource directory to the client directory, as follows:

app.use(express.static(path.resolve(__dirname, '../client')));

Add the vue-cli configuration file vue.config.js:

const path = require('path');
const clientPath = path.resolve(process.cwd(), './src/client');
module.exports = {
  configureWebpack: {
    entry: [
      path.resolve(clientPath, 'main.js')
    ],
    resolve: {
      alias: {
        '@': clientPath
      }
    }
  },
  devServer: {
    proxy: {
      '/api': { // in development, proxy /api requests to the back-end port
        target: 'http://localhost:9001'
      }
    }
  },
  outputDir: './dist/client/'
};

Add the following script to package.json:

{
  "script": {
    "clean": "rimraf dist",
    "pro-env": "cross-env NODE_ENV=production",
    "build:client": "vue-cli-service build",
    "build:server": "babel --config-file ./server.babel.config.js src/server --out-dir dist/server/",
    "build": "npm run clean && npm run build:client && npm run build:server",
    "start": "pm2 start pm2.json",
    "stop": "pm2 delete pm2.json"
  }
}

Executing the build command cleans the dist directory and compiles the front-end and back-end code into it; then npm run start has pm2 start dist/server/main.js.

So far, deployment is complete.

End

I found myself hanging up a sheep's head and selling dog meat: this was supposed to be a full-stack post, but I ended up writing mostly back end. Okay, I admit I wanted to write back end, but I still think that for a front-end engineer, Node.js is a necessary skill on this road. Keep going.

Project Source Address
