- Python Web Scraping Cookbook
- Michael Heydt
How it works
The boto3 library wraps the AWS S3 API in a Pythonic syntax. The .client() call authenticates with AWS and gives us an object for communicating with S3. Make sure your AWS keys are available in environment variables; otherwise the call will not be able to authenticate, as shown in the sketch below.
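A minimal sketch of this step, assuming the standard AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables are set (boto3 picks these up automatically, so no credentials need to appear in code):

```python
import boto3

# boto3 reads AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY (and optionally
# AWS_DEFAULT_REGION) from the environment when no explicit credentials
# are passed in.
s3 = boto3.client('s3')
```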
The bucket name must be globally unique. At the time of writing, this bucket name is available, but you will likely need to change it. The .create_bucket() call creates the bucket and sets its ACL, and .put_object() then uploads the scraped data as an object in that bucket.
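A hedged sketch of the bucket creation and upload, continuing from the client above; the bucket name, object key, and sample data are placeholders, and the public-read ACL is an assumption (newer accounts may block public ACLs by default):

```python
import json

bucket_name = 'my-scraping-results-bucket'  # placeholder: must be globally unique

# Create the bucket and set its ACL in one call. Outside us-east-1 you would
# also pass a CreateBucketConfiguration with your region.
s3.create_bucket(Bucket=bucket_name, ACL='public-read')

# Stand-in for the real scraped data.
scraped_data = {'planet': 'Mars', 'radius_km': 3389.5}

# Upload the data as an object in the bucket.
s3.put_object(Bucket=bucket_name,
              Key='scraped/planets.json',
              Body=json.dumps(scraped_data).encode('utf-8'))
```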