Calling the Baidu AI open platform API to estimate crowd counts (people flow)


Recently I have been working on estimating the number of people in low-density crowds and went through a lot of material. My first thought was to build and train a CSRNet myself; there are reference implementations of this network available.

The paper and code are listed below; anyone interested can try it themselves.
CVPR 2018 Paper : https://arxiv.org/abs/1802.10062

Keras implementation: https://github.com/DiaoXY/CSRnet

The official Pytorch implementation: https://github.com/leeyeehoo/CSRNet-pytorch
**Note:** the Keras code is a bit particular: the preprocessing phase is very time-consuming, and a laptop with only 4 GB of RAM (like mine) can hardly get through it.
The order of training and testing is: 'Preprocess.ipynb' (preprocessing) -> 'Model.ipynb' (training) -> 'Inference.ipynb' (model testing).
Regression-based crowd counting: the network regresses a crowd density map from the input image, and the head count is obtained directly by summing over the density map.
Compared with CrowdNet, MCNN and SCNN, CSRNet gives the best results among these density-regression models.
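As a rough sketch (with a random array standing in for a real CSRNet prediction), the count from a density-regression model is just the sum over the predicted density map:

import numpy as np

# the model outputs a density map of shape (H, W); each pixel holds an estimated
# person density, so the head count is the integral (sum) over the whole map
density_map = np.random.rand(768, 1024) * 1e-4  # placeholder for a real prediction
estimated_count = float(density_map.sum())
print(round(estimated_count))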
Today I happened to notice that Baidu's AI platform offers a people-flow statistics API. I tried this interface and the results are good; my feeling is that it works better than a CSRNet I would train myself, and it also supports dynamic scenes, for example counting foot traffic in a business district.

Because my development skills are pretty weak, it took me an afternoon of fumbling around to get it working. (Honestly, Baidu's API documentation really needs an update; it's 2019 already.) Baidu AI reference documentation.
The rough process is: register a Baidu Cloud account -> create the relevant application (API) following the wizard -> once it shows up in the application list, you can find the API Key (AK, used as client_id) and the Secret Key (SK, used as client_secret) -> then open your local editor.
I use Jupyter Notebook:

# encoding:utf-8
import base64
import json
import urllib.request
import urllib.parse
from PIL import Image
import matplotlib.pyplot as plt

# Step 1: get the access_token
# client_id is the AK obtained from the console, client_secret is the SK obtained from the console
host = 'https://aip.baidubce.com/oauth/2.0/token?grant_type=client_credentials&client_id=***personal***&client_secret=***personal***'
request = urllib.request.Request(host)
request.add_header('Content-Type', 'application/json; charset=UTF-8')
response = urllib.request.urlopen(request)
content = response.read()
if content:
    print(type(content))  # <class 'bytes'>
content_str = str(content, encoding="utf-8")
content_dict = json.loads(content_str)  # parse the JSON response (json.loads is safer than eval)
access_token = content_dict['access_token']
# Step 2: call the people-flow statistics API
request_url = "https://aip.baidubce.com/rest/2.0/image-classify/v1/body_num"  # here I call the crowd-counting endpoint body_num; take care to pick the right one
# open the picture file in binary mode
path = 'testdata/IMG_160.jpg'
image = Image.open(path)
plt.figure("Original image")  # display the original picture
plt.imshow(image)
f = open(path, 'rb')
img = base64.b64encode(f.read())
params = {"image": img}  # note: Baidu's docs say you can pass show=true to get an annotated image back; the correct parameter format is "show": "true"
params = urllib.parse.urlencode(params).encode("utf-8")  # pitfall: remember the encode("utf-8")
request_url = request_url + '?access_token=' + access_token
print(request_url)
request = urllib.request.Request(url=request_url, data=params)
request.add_header('Content-Type', 'application/x-www-form-urlencoded')
response = urllib.request.urlopen(request)
content = response.read().decode("utf-8")  # pitfall: decode("utf-8") is needed because read() returns bytes, not a string
if content:
    print(content)
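Continuing from the cell above, a quick sketch of pulling the predicted count out of the JSON response (it only uses the person_num field shown in the sample result below) and putting it on the displayed figure:

result = json.loads(content)  # e.g. {"person_num": ..., "log_id": ...}
person_num = result.get("person_num")
plt.title("Estimated number of people: {}".format(person_num))
plt.show()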

OK, the results show it works well overall, and the API offers 50,000 free calls per day.
{"person_num": 111, "log_id": 2444277999205366182}
The true count is 119 and CSRNet predicted 100; of course this is only one example, but the several images I tried all gave good results.

Posted by john_nyc on Tue, 06 Aug 2019 03:42:54 -0700