JUnit practice for Selenium testing

Automated testing simplifies the lives of software testers by letting them automate repetitive tasks, and open source test automation frameworks such as Selenium let users automate the Web testing experience at scale. But if you can't verify that a test case passed, what's the use of automated testing? This is a reflection ...
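The point here is that a Selenium run only becomes a real test once it asserts on the result. A minimal sketch of that same idea, shown with Python's unittest rather than the JUnit setup the post covers (the URL and expected title are placeholders, not the post's example):

```python
import unittest
from selenium import webdriver


class SearchPageTest(unittest.TestCase):
    """Illustrative only: assert on a page property instead of merely loading the page."""

    def setUp(self):
        self.driver = webdriver.Chrome()  # assumes chromedriver is on PATH

    def test_title_contains_keyword(self):
        self.driver.get("https://www.example.com/")
        # Without this assertion the script just opens the page;
        # the assertion is what turns it into a pass/fail test case.
        self.assertIn("Example", self.driver.title)

    def tearDown(self):
        self.driver.quit()


if __name__ == "__main__":
    unittest.main()
```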

Posted by Pests on Thu, 07 Nov 2019 23:17:11 -0800

Scrapy crawler in action - crawling sports lottery Pailie 5 historical data

Website address: http://www.17500.cn/p5/all.php 1. Create a new crawler project: scrapy startproject pfive 2. Create a new spider in the spiders directory: scrapy genspider pfive_spider www.17500.cn 3. Modify the entry url in the spider file: start_urls = ['http://www.17500.cn/p5/all.php'] 4. Define the item class PfiveItem ...
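A rough sketch of the pieces the excerpt lists (project pfive, spider pfive_spider, start URL, a PfiveItem item class); the field names and the table selector are guesses, not the post's actual definitions:

```python
# pfive/items.py -- field names are illustrative guesses
import scrapy


class PfiveItem(scrapy.Item):
    issue = scrapy.Field()    # draw number
    numbers = scrapy.Field()  # winning digits


# pfive/spiders/pfive_spider.py
class PfiveSpider(scrapy.Spider):
    name = "pfive_spider"
    allowed_domains = ["www.17500.cn"]
    start_urls = ["http://www.17500.cn/p5/all.php"]

    def parse(self, response):
        # The row/cell selectors are a guess at the page's table layout.
        for row in response.css("table tr"):
            cells = row.css("td::text").getall()
            if len(cells) >= 2:
                yield PfiveItem(issue=cells[0], numbers=cells[1])
```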

Posted by gid on Fri, 01 Nov 2019 04:32:47 -0700

python crawler - Dianping - merchant reviews

Content to grab: Dianping - Beijing - Haidian District - food merchants - reviews from the last three months. 1. Required configuration: Chrome browser, the Python selenium package. Starting selenium: execute the following code and a new Chrome icon appears; it's a test browser, where you can log ...
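A minimal sketch of that "start selenium and get a controllable Chrome window" step; logging in is then done by hand in that window (the site URL and the wait time here are assumptions):

```python
import time
from selenium import webdriver

# Opens a fresh Chrome window driven by selenium (assumes chromedriver is on PATH).
driver = webdriver.Chrome()
driver.get("http://www.dianping.com/")

# Log in manually in the opened browser, then keep scraping with the same session.
time.sleep(60)
html = driver.page_source
```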

Posted by bob_rock on Thu, 31 Oct 2019 06:45:14 -0700

selenium -- upload file

Preamble: in web automation, the ability to upload files is often needed. selenium can upload files with send_keys(), but this has big limitations: only input tags can receive a file this way, and many other controls can't be handled. Here we use the third-party module pywin32 to simulate uploading files.   actual co ...
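For the simple case the excerpt mentions (an input tag of type file), send_keys() alone is enough; a sketch with an assumed URL, locator, and file path:

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get("https://example.com/upload")  # placeholder URL

# Works only when the upload control is an <input type="file"> element;
# for other widgets the post falls back to pywin32 to drive the native dialog.
file_input = driver.find_element(By.CSS_SELECTOR, "input[type='file']")
file_input.send_keys(r"C:\data\report.xlsx")  # assumed local file path
```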

Posted by DoD on Tue, 29 Oct 2019 12:52:26 -0700

A simple crawl of Douban short reviews of the movie Joker to generate a word cloud

Introduction   I saw the Joaquin Phoenix movie a while ago, and I was very curious what the audience would say about it after watching it. After reading the Douban short reviews, I thought I'd use python to extract the most frequent words in the short reviews and make a word cloud, to see what key words the movie left with its audience. Grab da ...
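A sketch of the word-frequency-to-word-cloud step (jieba for Chinese segmentation, the wordcloud package for rendering); the sample comments, font path, and output name below are placeholders:

```python
import jieba
from wordcloud import WordCloud

# `comments` stands in for the Douban short-review texts scraped earlier.
comments = ["placeholder review one", "placeholder review two"]

# Segment the Chinese text so word frequencies are meaningful.
text = " ".join(jieba.cut(" ".join(comments)))

wc = WordCloud(
    font_path="msyh.ttc",          # a CJK-capable font is needed; path is an assumption
    width=800, height=600,
    background_color="white",
).generate(text)
wc.to_file("joker_wordcloud.png")
```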

Posted by boo_lolly on Wed, 23 Oct 2019 14:01:36 -0700

appium automation -- PageObject pattern 02

Directory structure changes: 1. The AndroidClient.py script has no changes: #AndroidClient.py from appium.webdriver.webdriver import WebDriver from appium import webdriver class AndroidClient(object): driver:WebDriver @classmethod def installApp(cls)->WebDriver: caps={} ...
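A rough sketch of what the truncated installApp() classmethod typically does in this pattern — build the desired capabilities and return a driver; every capability value below is a placeholder, not the post's actual configuration:

```python
# AndroidClient.py -- sketch only; capability values are placeholders
from appium import webdriver
from appium.webdriver.webdriver import WebDriver


class AndroidClient(object):
    driver: WebDriver

    @classmethod
    def installApp(cls) -> WebDriver:
        caps = {
            "platformName": "Android",
            "deviceName": "emulator-5554",    # placeholder device
            "appPackage": "com.example.app",  # placeholder package
            "appActivity": ".MainActivity",   # placeholder activity
        }
        # Connect to a locally running Appium server.
        cls.driver = webdriver.Remote("http://127.0.0.1:4723/wd/hub", caps)
        cls.driver.implicitly_wait(10)
        return cls.driver
```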

Posted by hossfly007 on Fri, 18 Oct 2019 07:51:28 -0700

Introduction to Python selenium usage

# Version python==3.7.3 selenium==4.0.0a1 # selenium PyPI address https://pypi.org/project/selenium/ Contents: 1. Initialization 2. Element search 3. select tag operations 4. Executing js scripts 5. iframe operations 6. Actions and action chains 7. Exception handling 8. Quitting the program 1. Initialization from selenium import webdriver fr ...
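A minimal sketch of the initialization, element-search, and quit steps from that outline (the URL and locator are assumptions, not the post's examples):

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

# 1. Initialization: start a Chrome session (assumes chromedriver is on PATH)
driver = webdriver.Chrome()
driver.get("https://www.example.com/")

# 2. Element search: locate an element, then read or interact with it
heading = driver.find_element(By.TAG_NAME, "h1")
print(heading.text)

# 8. Quit: close the browser and end the session
driver.quit()
```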

Posted by greenie__ on Mon, 14 Oct 2019 09:11:13 -0700

Crawling Xiaomi information

Explanation: What this Xiaomi crawl does well: put the two links together and run once to get them all (about 700 items); uses a combination of selenium + chrome + lxml (fast, because it's just one page). Output: the program generates three files, two CSVs and one xls; csv is compact and versatile. data_mi.csv uses utf-8 encoding. data_mi-gbk.csv uses GBK encoding ...
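A sketch of the dual-encoding CSV output the excerpt describes, assuming the scraped records are already in a list of dicts (the field names and sample row are placeholders):

```python
import csv

# `rows` stands in for the ~700 records scraped with selenium + lxml.
rows = [{"name": "placeholder product", "price": "199"}]
fields = ["name", "price"]

# Write the same data twice: once as UTF-8, once as GBK.
for path, encoding in [("data_mi.csv", "utf-8"), ("data_mi-gbk.csv", "gbk")]:
    with open(path, "w", newline="", encoding=encoding) as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        writer.writerows(rows)
```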

Posted by Jewbilee on Tue, 08 Oct 2019 23:18:24 -0700

springboot adds a startup splash screen, resolving the multithreaded [restartedMain] issue

The springboot project needs a startup-wait feature: while the project is loading and starting, a customized splash screen should pop up on the desktop, with a progress bar underneath it. 1. Add the file JWindows.java in the same place as the Application class import com.CIDataCompare.a ...

Posted by JC99 on Sat, 05 Oct 2019 20:43:59 -0700

In 10 minutes: given a movie's English name, use python to crawl its Chinese name and box office from Maoyan.

[root@xxn maoyan]# cat cat.py #!/usr/bin/env python #coding:utf-8 import requests from bs4 import BeautifulSoup def movieurl(url): """ //One-page url address for getting movies """ headers = { "User-Agent":"Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, li ...
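A hedged sketch of the flow the truncated cat.py suggests — fetch a Maoyan search page for the English name, then parse the Chinese title out of the HTML with BeautifulSoup; the search URL pattern and CSS selector here are guesses, not the post's actual ones:

```python
import requests
from bs4 import BeautifulSoup


def movieurl(name):
    """Fetch the search-result page for a movie name (URL pattern is a guess)."""
    headers = {"User-Agent": "Mozilla/5.0 (Windows NT 6.1; WOW64)"}
    url = "https://maoyan.com/query?kw=" + name
    resp = requests.get(url, headers=headers, timeout=10)
    return resp.text


def parse(html):
    """Pull out the first Chinese title; the selector is illustrative only."""
    soup = BeautifulSoup(html, "html.parser")
    title = soup.select_one("div.movie-item-title")
    return title.get_text(strip=True) if title else None


if __name__ == "__main__":
    print(parse(movieurl("Joker")))
```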

Posted by stukov on Sat, 05 Oct 2019 19:51:18 -0700