At work we often use Excel to process and display data, but many of these tasks can be solved far more efficiently with a program.
For example: parsing the JSON returned by an API and saving it into Excel, processing the data in an Excel file according to certain rules and then writing the result into a new Excel file, and the co ...
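As a minimal, standard-library-only sketch of the first task (the payload and field names here are made up for illustration; a real project would likely use a dedicated Excel library, while CSV is used below simply because Excel opens it directly):

```python
import csv
import json

# Hypothetical JSON payload, as an API might return it.
payload = '[{"name": "Alice", "score": 90}, {"name": "Bob", "score": 85}]'
records = json.loads(payload)

# Write the parsed records to a CSV file, which Excel can open directly.
with open("result.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["name", "score"])
    writer.writeheader()
    writer.writerows(records)
```

The same parse-then-write shape carries over unchanged if the output step is swapped for a real Excel writer.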
Posted by 3r0ss on Sun, 24 May 2020 23:33:32 -0700
When we first come into contact with programming, we learn that there are two major paradigms in the field: procedural and object-oriented. Python is an object-oriented language. If you are already familiar with object-oriented thinking, you naturally know that it rests on two basic concepts: ...
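Those two concepts are the class (the blueprint) and the object (an instance built from it). A minimal illustration, with a made-up `Dog` class:

```python
class Dog:
    """A class is the blueprint; each object is one instance of it."""

    def __init__(self, name):
        self.name = name  # per-instance state

    def speak(self):
        return f"{self.name} says woof"

# Two distinct objects created from the same class.
a = Dog("Rex")
b = Dog("Fido")
print(a.speak())  # Rex says woof
print(b.speak())  # Fido says woof
```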
Posted by clairian on Sun, 24 May 2020 02:29:45 -0700
1. First, let's see how Python can crawl a web page in a simple way. 1. Preparation: the BeautifulSoup4 and chardet modules used in this project are third-party packages; if you don't have them, install them with pip. I do the installation in PyCharm, so next I will briefly show how to install chardet and BeautifulSoup4 in PyCharm.
Follow the steps ...
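To show the idea without the third-party packages (a rough stand-in for what BeautifulSoup4 does, using only the standard library and a hard-coded HTML snippet instead of a live page), here is link extraction with `html.parser`:

```python
from html.parser import HTMLParser

class LinkParser(HTMLParser):
    """Collect the href attribute of every <a> tag seen."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

html = '<html><body><a href="/page1">one</a> <a href="/page2">two</a></body></html>'
parser = LinkParser()
parser.feed(html)
print(parser.links)  # ['/page1', '/page2']
```

BeautifulSoup4 makes the same extraction a one-liner and handles malformed HTML far more gracefully, which is why the article installs it.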
Posted by suspect on Sat, 23 May 2020 22:08:21 -0700
Previously, we used Detectron2's keypoint_rcnn to train our own dataset and detect object keypoints. This article converts that PyTorch model into a form that can be called from C++. Effect:
Detectron2 provides routines for converting a PyTorch model for C++ use: Deployment. Converting a GPU-version model requires Python ...
Posted by NArc0t1c on Sat, 23 May 2020 07:11:54 -0700
Source: the internet, for learning only. If there is any infringement, please contact us to have it deleted.
If we can automate our reports, we will save a lot of time and get work done more efficiently. So how can we use Python to automate reporting?
This article introduces the common functions of xlwt, xlrd and xlutils, the use of formulas ...
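The core loop of report automation is read, apply a rule, write a new file. A self-contained sketch of that loop (the sales data is invented; CSV stands in for the xlwt/xlrd workbooks so the example runs with the standard library alone):

```python
import csv

# Hypothetical input: write a small sales sheet so the example is self-contained.
with open("sales.csv", "w", newline="", encoding="utf-8") as f:
    csv.writer(f).writerows([["item", "qty", "price"],
                             ["pen", "10", "1.5"],
                             ["pad", "3", "4.0"]])

# Read the sheet, apply a rule (compute each row's total), write a new report.
with open("sales.csv", newline="", encoding="utf-8") as src, \
     open("report.csv", "w", newline="", encoding="utf-8") as dst:
    reader = csv.reader(src)
    writer = csv.writer(dst)
    header = next(reader)
    writer.writerow(header + ["total"])
    for item, qty, price in reader:
        writer.writerow([item, qty, price, float(qty) * float(price)])
```

With xlrd/xlwt the structure is identical; only the open/read/write calls change, and formulas can replace the Python-side arithmetic.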
Posted by JoeBuntu on Thu, 21 May 2020 23:13:05 -0700
Scrapy is an application framework for crawling website data and extracting structured data. More details on using the framework can be found in the official documentation. This article shows the general process of crawling cartoon images.
Scrapy environment configuration
The first step is installing Scrapy. The blogg ...
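A real Scrapy spider needs the scrapy package and a live site, but one small piece every image crawler contains can be shown with the standard library alone: mapping an image URL to a local save path (the URL below is made up for illustration):

```python
import os
from urllib.parse import urlparse

def local_name(image_url, out_dir="images"):
    """Map an image URL to a local save path, keeping the original filename."""
    filename = os.path.basename(urlparse(image_url).path)
    return os.path.join(out_dir, filename)

print(local_name("https://example.com/comics/ch01/p001.jpg"))
# images/p001.jpg on Linux
```

In Scrapy this logic would live in an item pipeline (or be delegated to the built-in ImagesPipeline).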
Posted by saloon12yrd on Wed, 20 May 2020 21:07:13 -0700
Let's start with a brief introduction to the unittest framework.
1. Structure the folders:
Note that the folders above must all be packages; that is, when creating a new folder via New, do not select Directory but select Package.
The first step is done once the folders are built!
2. The second step is the idea of object-oriente ...
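Once the package layout is in place, a test module is just a class deriving from `unittest.TestCase`. A minimal sketch (the `add` function is a made-up stand-in for code under test):

```python
import unittest

def add(a, b):
    """Function under test (hypothetical)."""
    return a + b

class TestAdd(unittest.TestCase):
    def test_positive(self):
        self.assertEqual(add(1, 2), 3)

    def test_negative(self):
        self.assertEqual(add(-1, -2), -3)

if __name__ == "__main__":
    # Discover and run the tests in this module.
    unittest.main(exit=False)
```

Each method whose name starts with `test_` is collected and run automatically; `unittest.main()` is the usual entry point when the module is executed directly.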
Posted by Kurtismonger on Tue, 19 May 2020 22:06:59 -0700
For file, directory and path operations in Python, we usually use the os.path module.
pathlib is its replacement: built on top of os.path, it wraps the path in an object, giving a more approachable API that is more convenient to use and better matches object-oriented habits.
The pathlib module provides classes that use sem ...
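A quick taste of the object-style API, with the os.path equivalents noted in comments (the path itself is an arbitrary example):

```python
from pathlib import Path

p = Path("data") / "reports" / "2020.txt"  # the '/' operator joins path parts

print(p.name)    # 2020.txt   (os.path.basename)
print(p.suffix)  # .txt       (os.path.splitext(...)[1])
print(p.parent)  # data/reports on Linux  (os.path.dirname)

# Other common replacements:
# os.path.join(a, b)  ->  Path(a) / b
# os.path.exists(p)   ->  Path(p).exists()
```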
Posted by SuNcO on Tue, 19 May 2020 20:44:08 -0700
On Linux, to check whether a process with a given PID exists, you can proceed as follows:
#! /usr/bin/env python
if __name__ == "__main__":
Returns True if the PID exists in the sy ...
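One common way to complete that check (a sketch, not necessarily the article's exact code) relies on `os.kill` with signal 0, which performs only the existence and permission check without delivering any signal:

```python
#!/usr/bin/env python
import os

def pid_exists(pid):
    """Return True if a process with this PID exists (Linux)."""
    try:
        os.kill(pid, 0)  # signal 0: existence/permission check only
    except ProcessLookupError:
        return False     # no such process
    except PermissionError:
        return True      # process exists but belongs to another user
    return True

if __name__ == "__main__":
    print(pid_exists(os.getpid()))  # True: our own PID always exists
```

An alternative on Linux is simply testing whether the directory `/proc/<pid>` exists.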
Posted by banjax on Tue, 19 May 2020 07:52:33 -0700
First question: use the requests library to visit the Baidu homepage 20 times, and report the length of each response's text and content attributes.
# -*- coding: utf-8 -*-
"""
Created on Tue May 19 10:12:16 2020

@author: 49594
"""
import requests

url = "https://www.baidu.com/"
for i in range(20):
    rest ...
Posted by RDx321 on Tue, 19 May 2020 07:45:21 -0700